General Hardware - Talk about PCs/Macs, motherboards, CPUs, sound cards, RAM, hard drives, networking and everything else about computer hardware!
#32 - RIP Roxen
Didn't seem to hurt performance much.

I'm having a hard time getting things stable. I keep getting WHEA BSODs, which point to an unstable vcore. I passed a bunch of Cinebench runs and 2 hours of RealBench yesterday, and today it BSOD'd on the first run at the same settings. I bumped the voltage up a ton and it still BSODs. I'm at almost 1.28v under load now and it blue-screened again after 5 runs. I'm thinking maybe my VCCIO or SA voltages are too low, but those would give me memory-related issues, not a vcore error. Will troubleshoot further when I get home. I might take the chip back and exchange it for a different one tomorrow before work.
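(Aside: if you want a quick way to load every core between full Cinebench/RealBench sessions, a crude all-core burn loop like the sketch below works as a smoke test. It is absolutely not a substitute for the real stress tests discussed in this thread, just a minimal, self-contained example.)

```python
# Crude all-core stability smoke test: pin every logical core with math for a
# fixed time. Illustrative only; use Cinebench/RealBench/Prime95 for real testing.
import math
import os
import time
from multiprocessing import Pool

def burn(seconds: float) -> int:
    """Spin on floating-point math until the time budget runs out; return iteration count."""
    end = time.time() + seconds
    x, n = 0.5, 0
    while time.time() < end:
        x = math.sqrt((math.sin(x) + math.cos(x)) ** 2 + 1.0)  # always >= 1, no domain errors
        n += 1
    return n

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    with Pool(cores) as pool:
        iters = pool.map(burn, [60.0] * cores)  # one minute of load per logical core
    print(f"{cores} workers finished, ~{sum(iters):,} iterations total")
```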
#33 - Radeon Evergreen
I don't know what you have tried, but I figured I could shoot some suggestions your way. Set your BIOS to unlock all limits. Try a manual Load-Line Calibration setting of level 6, because it doesn't run as hot. I have my vcore at 1.375v, but it shows up as 1.36v in HWMonitor because of vdroop, and it's still stable. Set your cores to sync all cores and use a manual voltage for the CPU. If you are trying for 5.1GHz, I would suggest 1.33v vcore, and even then you will need a -2 AVX offset or you will run out of juice on the Prime95 torture test. If you have problems with heat, then you will have to back down. From what I hear, 5GHz all-core is not standard, and personally I would be happy with 5GHz all-core and a -1 AVX offset, to be honest. I am seeing stability at 5GHz all-core with a -1 AVX offset using a vcore of 1.27v.

Make sure your RAM is not pushed beyond its normal XMP values until you get your CPU stable. Keep your VCCIO and System Agent at 1.2v to start; you can whittle them down later. This will allow it to get enough power without burning out your memory controller. I have no idea why ASUS's XMP 1 profile sets a VCCIO of something like 1.4v. Also, these chips run hot. Notice in the game shots I posted that my CPU is running 30% hotter than my graphics cards.

BTW, I'm not trying to be a know-it-all or undermine your knowledge, just trying to be helpful. I struggled for stability more with my i9-9900K than with any chip before it.

Last edited by the_sextein : Jul 10, 2019 at 10:30 PM.
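(Aside: a quick, illustrative way to put a number on the vdroop being described. The values below are just the ones quoted in this post, not recommendations for any particular chip or board.)

```python
# Rough vdroop check: how far vcore sags under load versus the BIOS-set value.
# Example numbers are the ones quoted above (1.375 V set, ~1.36 V in HWMonitor).

def vdroop(bios_vcore: float, load_vcore: float) -> tuple[float, float]:
    """Return (droop in volts, droop as a percent of the set voltage)."""
    droop_v = bios_vcore - load_vcore
    return droop_v, 100.0 * droop_v / bios_vcore

dv, pct = vdroop(1.375, 1.36)
print(f"Droop: {dv:.3f} V ({pct:.1f}% of the set voltage)")  # ~0.015 V, ~1.1%
```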
#34 - RIP Roxen
I'm using manual voltage with SpeedStep and all that power-save garbage disabled, so it runs full throttle at full voltage 24/7. I don't think this chip will run 5.1GHz. I'm at 1.38v in the BIOS, which gives me 1.376v idle and 1.323v under load. That is nuts. At these settings it seems to be stable, but the heat is nuts too. I'm going to back it down to 5GHz and leave it for now, and consider a trip to Microcenter at some point this week to swap it out. It's really strange that Cinebench is the only thing that caused errors, after hours of BF5, Assassin's Creed: Odyssey, and RealBench testing. Oh well. And I appreciate the tips! No worries, you've had the chip longer than me.

I ordered an Alphacool Eisbaer 420mm cooler and cancelled it 3 times today. I was going to buy a block for the 2080 Ti and throw it on the loop, but cancelled everything once I realized Alphacool doesn't make a block for the EVGA FTW3 Ultra. Oh well. I'll have a Corsair H115i this weekend to swap in for this H100i; the 120mm to 140mm fans and slightly larger rad should help a little.

Edit: Dropped down to LLC6. 1.3v in the BIOS gives me 1.208v under load. Dropped down to 5GHz with a -1 AVX offset, and can you believe it just did 20 passes at that voltage? Cinebench R15 isn't AVX, so that's at 5GHz the entire time. For some reason I'm having to jump up to 1.32v under load just for 100MHz... that's nuts. If it's going to take that much voltage, I'd rather just stay at 5GHz.

Edit2: Letting it run RealBench overnight. 5GHz with a -1 offset at 1.28v in the BIOS with LLC6 gives me 1.26v idle and 1.19v under load. 1.19v!! That's so low. Temps are in the high 60s to low 70s. It passed 30 runs of Cinebench R15 and 10 of Cinebench R20. If I wake up and it has done 7 hours of RealBench, I'm calling that stable. The real question is why I'm jumping up so hard in voltage for only 100MHz. I couldn't get Cinebench stable until 1.323v under load at 5.1GHz. That's over a 0.100v increase, and I'm not even sure it's completely stable; it only did about 10 runs, and I had multiple cores pop 100C at that voltage. That's also with a -1 AVX offset, so I'm wondering whether trying 5GHz with no offset will run into the same issue and make me fight the voltage demon. We'll see. If this voltage is stable, I'm going to crank the cache up and try to get it 1:1 at 5GHz.

Also, what's your CPU PLL running at? Mine is 1.21v on the auto setting; there might be some heat savings in dropping that slightly. You can find CPU PLL in the Tweaker's Paradise section of the BIOS, and AIDA64 reads the voltage so you can see what it's running at in Windows.
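(Aside: to put a rough number on why that last 100MHz costs so much heat, a common back-of-the-envelope model is that dynamic CPU power scales roughly with frequency times voltage squared. This is only an approximation that ignores leakage, and the voltages below are just the ones quoted in this post.)

```python
# Ballpark dynamic-power comparison, assuming P ~ f * V^2 (ignores static/leakage power).
# Load voltages are the ones reported above: ~1.19 V at 5.0 GHz vs ~1.32 V at 5.1 GHz.

def relative_power(f_ghz: float, vcore: float, f_ref: float = 5.0, v_ref: float = 1.19) -> float:
    """Power at (f_ghz, vcore) relative to the 5.0 GHz / 1.19 V baseline."""
    return (f_ghz / f_ref) * (vcore / v_ref) ** 2

print(round(relative_power(5.1, 1.32), 2))  # ~1.26 -> roughly 25% more heat for a 2% clock bump
```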
__________________
"If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe Last edited by Nunz : Jul 11, 2019 at 02:31 AM. |
#35 - Radeon Evergreen
I went through the same thing. Temps and voltage are pretty good around 4.9 to 5GHz, but anything past that is just pushing it too hard; these chips are already pushed to the limit. If you can get 5GHz with a -1 AVX offset, I wouldn't take it back. You got a good chip. Load-line level 6 and a voltage of 1.26v to 1.29v should see you through with good temps and stability once you get everything set correctly in the BIOS. My BIOS has an option to use XMP 1 or 2; XMP 1 disables all system limits, which is what I am using, and that is what I meant by "disable all system limits."

Windows power management should allow the chip to clock down when it's not needed. The chip should not be running at 5GHz constantly on your desktop; it should fluctuate. In game it should stay at 5GHz constantly, though. Using the remove-all-limits XMP 1 setting doesn't disable SpeedStep, but it allows the chip to run at constant boost speeds while under load. It requires a little more work, but getting 300MHz on 8 cores isn't bad considering it's 14nm+++, and once you get it stable you might be able to push non-AVX mode a little higher with a little more juice if you use a custom loop.

Like I said on the other page, anything past 5GHz is about a 10C heat increase per 100MHz. I can get away with 5.2GHz in non-AVX mode, but AVX can't go over 5GHz, and I prefer 4.9GHz just to be safe. 5.2GHz with no AVX offset pushes my chip to 90C on a custom loop, which isn't safe. If I ran a full 3D render or Prime95 at 5.2GHz with no AVX offset it would probably throttle the chip. (No way I'm even going to try it.)

I don't know what my PLL is off the top of my head, but I didn't manually alter it. My chip droops a lot under load just like yours does. As long as it's stable, that's all that matters. 1.28v vcore at 5GHz with a -1 AVX offset sounds good to me, and it sounds like you are getting things sorted. Once the sh!tty part is done, you can play some games and really appreciate what this monster can do.

Last edited by the_sextein : Jul 11, 2019 at 07:05 AM.
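(Aside: for anyone following along, here is a tiny sketch of what a negative AVX offset actually does to clocks: the multiplier drops by the offset only while AVX code is running. The multiplier and offset values are just the ones discussed in this thread.)

```python
# Effective core clock with a negative AVX offset: the all-core multiplier is
# reduced by the offset only while AVX instructions are executing.

def effective_clock_ghz(all_core_mult: int, avx_offset: int, avx_load: bool, bclk_mhz: float = 100.0) -> float:
    mult = all_core_mult - (avx_offset if avx_load else 0)
    return mult * bclk_mhz / 1000.0

print(effective_clock_ghz(50, 1, avx_load=False))  # 5.0 GHz for non-AVX workloads (e.g. most games)
print(effective_clock_ghz(50, 1, avx_load=True))   # 4.9 GHz when AVX kicks in (e.g. Prime95 AVX, renders)
```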
#36 - Radeon Evergreen
When you are pushing your cache, run a fixed-loop game benchmark and watch the performance. On mine, performance stopped improving and even went down a little when I pushed the cache to 4.9GHz, which is why I ended up settling on 4.7GHz for the cache. I'd be interested to see if you get any improvements beyond that. I have not seen a lot of talk about real-world improvements from the cache OC, so let me know.
#37 - RIP Roxen
It ran 7 hours of RealBench overnight at 1.28v in the BIOS / 1.19v under load. Crazy!

I'm old-school; I don't like or care for the speed-changing/voltage-changing stuff. Frankly, I wish I could flash the Boost feature off the new GPUs. I miss the old days. The extra voltage/clock speed won't do much in terms of hurting the lifespan of the CPU, especially when that higher voltage is while the chip is idling. Constantly being under load will do more harm than higher idle voltages.

I have all limits disabled with XMP 2. XMP 1 or 2 on my board only changes the memory timings: XMP 1 is an ASUS "tweaked" XMP setting that tries to tighten the timings beyond what the DRAM's XMP profile specifies, and it was causing blue screens in BF5 for me. XMP 2 is just the standard DRAM XMP. I bumped the cache up to 4.7GHz and left everything else as is. We'll see how this goes!
#38 - RIP Roxen
Pretty much tightened everything up and she's just about done: 5GHz with a -1 AVX offset at 1.28v BIOS / 1.19v under load, cache at 4.8GHz (I may continue pushing it later on). Runs really cool, and that's still with the busted-up H100i. The H115i should give me slightly better thermals. I don't think I did the best TIM job either, so that might help a bit as well. I'm considering sanding down the IHS, as it has some big concave spots, evidenced by the 10C difference in core temps at times; core 5 always runs warmer.

Cinebench R15 is even scoring well. Hit a 2170 at 5GHz with 4.8GHz cache. The highest I saw was 2200 flat when I was hammering the chip at 5.1GHz with 4.5GHz cache. Running cool and fast.
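(Aside: as a quick sanity check on those numbers, the points-per-GHz works out nearly identical at both settings, which suggests the scores are scaling about as well as the extra 100MHz allows. The scores are the ones quoted above.)

```python
# Points-per-GHz comparison for the Cinebench R15 runs quoted above.
runs = {
    "5.0 GHz / 4.8 GHz cache": (2170, 5.0),
    "5.1 GHz / 4.5 GHz cache": (2200, 5.1),
}

for label, (score, ghz) in runs.items():
    print(f"{label}: {score / ghz:.0f} pts/GHz")
# ~434 pts/GHz vs ~431 pts/GHz -> the extra 100 MHz barely moves the needle
```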
__________________
"If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe Last edited by Nunz : Jul 11, 2019 at 01:37 PM. |
#39 - Radeon Evergreen
They really are a little more fussy than previous chips. I used to tweak the voltage and multiplier and just call it a day. I'm glad you got it under control; it's been a good, stable chip for me and has delivered the goods performance-wise for the things I use it for.
#40 - Radeon Arctic Islands
That's just about where I'm at. I haven't screwed with the cache yet; I think I'll do that this weekend. I'm just happy with my temps at 5.0GHz. I did get it to 5.1GHz, but the temps were a bit too high for me, and I'm really happy gaming under 70C and stress testing under 80C. This thing is night-and-day different from my 4770K. Games are smoother and have slightly higher frame rates, and stuff just generally runs better.

About the only complaint I have so far is that some Battlefield V models render a bit funny from time to time. Not sure if that's an Nvidia thing, a Battlefield V thing, or an overclocking thing. It's just kind of weird to see a face and hands floating around with no body. It doesn't happen all the time, maybe once every hour or so. Just strange. Also, it's been like 3 to 6 months since I played it last; maybe that's a new EA feature that's unlockable with $$$...
#41 - RIP Roxen
It's a bug with BF5. Good ol' DICE!

Cache at 4.8GHz hard-locked Assassin's Creed after an hour or two of gaming. Dropped down to 4.7GHz and it went 4 hours of gaming last night. I'm done tweaking. I'll drop the VCCIO voltage down a tad (it's at 1.05v; I want to get it to 1.0v flat) and then that's it! Picking up the H115i tomorrow. This H100 has been so weird; I'm wondering how much better the temps will be once I swap coolers.
__________________
"If you can't handle me when I'm bulking, you don't deserve me when I'm cut." -- Marilyn Monbroe Last edited by Nunz : Jul 12, 2019 at 11:56 AM. |
#42 - Radeon Arctic Islands
Quote:

Good luck though, dude.
#43 - space cadet
Thanks for the Cinebench scores; single-core is very close to the 8700K. Can you run Geekbench?
#44 - Radeon Evergreen
Hey demo, here is the requested benchmark.

[benchmark screenshot]

If you want my opinion, I don't think you should upgrade. Most games don't support 8 cores, and even when they do, it won't really matter unless your cores are maxed out, which they won't be for a long while considering where the industry is headed. I think the PS5 will come out next year, so you'll see a little more support for 8 cores, but with 12 threads and 5+GHz speeds you should be fine. An overclocked i9-9900K will probably be 15% faster, if I had to guess.

https://www.youtube.com/watch?v=iwXK2rTpmgs

Not bad for a $380 6-core chip from 2 years ago. If you overclock to 5.2GHz, your chip will be ahead of the game at web browsing, Office, gaming, and general computing. I think you should wait till next year and see what AMD does with its 7nm. If it doesn't improve by 30%, then wait for Intel in late 2020 or early 2021. If you game at 4K and don't use SLI, you might as well not worry about it until Nvidia releases the 3080 Ti, and possibly even the card after that, before the GPU is able to hit the CPU limitations of very old CPUs. I doubt a single 3080 Ti will be much faster than a 1080 Ti SLI setup. I'm hitting CPU minimums on my current chip, but it will take 2x 3080 Ti to hit the averages, and it will be years before a single card reaches that point with the 15% yearly improvements going on in the GPU industry.

Last edited by the_sextein : Jul 12, 2019 at 11:20 PM.
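(Aside: taking the assumed 15% per year GPU improvement at face value, and that figure is the poster's guess rather than a measurement, a quick compounding check shows why "years before a single card reaches that point" is about right.)

```python
# How many years of compounding ~15% annual GPU gains until a single card is
# roughly 2x today's card (about where an ideal two-card SLI setup would sit)?
# The 15% rate is just the figure assumed in the post above.
rate, perf, years = 0.15, 1.0, 0
while perf < 2.0:
    perf *= 1 + rate
    years += 1
print(years, round(perf, 2))  # -> 5 years, ~2.01x
```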
#45 - space cadet
Thanks, single-core is almost identical: 6828 & 32596. Yeah, I'll hold off; I really want AMD, tbh. Hope there's some fine wine to be had.

As for 1080 Ti SLI, I was contemplating adding a second card, but I read that TAA breaks SLI, and almost every new title uses TAA.
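(Aside: a quick ratio check on those quoted Geekbench numbers, useful for eyeballing multi-core scaling; which chip they came from isn't spelled out in the thread.)

```python
# Ratio of the Geekbench scores quoted above (6828 single-core, 32596 multi-core).
single, multi = 6828, 32596
print(f"Multi-core is {multi / single:.2f}x the single-core score")  # ~4.77x
```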
#46 - Radeon Evergreen
Yeah dude, it sucks. Windows 10 full-screen optimizations interfere. DX12 interferes. TAA can work, but it requires extra hassle, and while SLI profiles can be built for DX11, there's no telling how much longer it will be supported. Also, RTX hasn't worked in SLI on any game yet, and those effects require hex-editing the executable, which is time-consuming and annoying. It's not looking good, to be honest. I think the gap to 8K is going to be large enough for consoles to catch up to the PC for the most part. Pretty pathetic. That's what happens when you sell chips with no improvements for thousands of dollars over and over again and people keep buying them.
#47 - RIP Roxen
Sigh... grabbed the H115i, set everything up, and the bracket studs must be longer on the H100i, because it doesn't make contact with the chip. Gonna have to stop at Microcenter tomorrow.
#48 - Truth Doesn't Make Fact
__________________
| Fractal Design Define R5 | ASUS Crosshair VI Hero | AMD 3700x w/ H100i v2 | 16GB G.Skill Flare-X | Aorus 1080Ti Waterforce Extreme Edition | "Don’t waste your time on jealousy. Sometimes you’re ahead, sometimes you’re behind. The race is long and, in the end, it’s only with yourself." |
#49 - That postcount though...
You happy with the 9900K, or do you want to swap out to an AMD build?
#50 - RIP Roxen
I'm happy. No plans to swap out.
#51 - Radeon Evergreen
I just watched the Gamers Nexus review on YouTube, and it pretty much mirrors what Tom's Hardware was showing in regards to gaming. The i9-9900K gains huge advantages from overclocking/high frequencies, often beating the 3900X by 10 to 45 FPS in games, and in all of those situations the i9-9900K is GPU-bound. This means that when Nvidia releases a new GPU in the future, you will see the i9-9900K pull ahead even further. As someone who uses SLI, my GPU bottlenecks are far removed from what he is getting, which pushes the system to the CPU minimum bottlenecks even at 4K.

My only argument with their findings overall would be his comment that, while in many instances CUDA processing is better for 3D rendering, there are still situations where CPU rendering is better. That depends: GPU rendering can use multi-GPU configurations without the need for SLI support. For example, you could put a 2070, a 2080 Ti, and a 1080 Ti in the same system and Daz Studio would use 100% of all 3 cards to render. I have a triple-GPU setup and have been 3D rendering for a few years now. Under no circumstance have I found CPU rendering to be even 1/5 the speed of GPU rendering; usually my renders are 10x faster than on the CPU. It's possible that I personally don't benefit from, and am not aware of, those instances, but in my experience there is no benefit from having higher CPU rendering capability. If I were to use 3Delight instead of Iray then yes, but I see no real advantage in doing so. You could simply throw the last-generation GPU from an old build into your rig alongside your new card and it will outperform the 3900X by a huge margin in 3D rendering without costing you a penny.

The i9-9900K is still the better CPU for general Windows usage, Microsoft Office, Adobe Premiere and Photoshop, web browsing, and gaming by a large margin. There is nothing to be upset about regarding the i9-9900K, since it's a top performer at the things the mainstream uses CPUs for, it's been on the market for 9 months and is still bashing heads, and its price will be dropping near the $399 mark very soon. It's also on a very mature and stable platform. There are advantages to be had for content-creator hobbyists with higher core counts, and the 3900X delivers better content creation at a price that demolishes the high-end platform chips you used to have to buy to get that level of performance. In things like video encoding, the review shows up to 40% higher performance on the 3900X vs the i9-9900K.

You know I'm content where I sit, especially since I've had this chip since last year. For what I do, the i9-9900K provides the best performance you can get. With the core counts of the next-gen consoles limited to 8, and Intel's high-frequency stance and the lower core-count limitations of its current architecture, I don't see games needing more cores for a while, so I don't see things changing in the near future. Especially since AMD's 3900X is the only chip in its lineup that delivers more than 8 cores, unless you count the $750 3950X, I don't see game developers wasting time on that for the next 5 years. The 6-core 8700K is doing just fine, as you can see, and while it may suffer a 15% disadvantage in console games that use 8 cores in the future, it will still perform fine, and the i9-9900K will be in great shape for the next 4 years when it comes to gaming and general everyday computing, if I had to guess. It will be a setup with legs, so to speak, because improvements are slowing to a crawl on the PC.

AMD is going to be squeezing every drop of performance out of their existing setup, and Intel is still a year or more out from releasing anything new, so they will be wringing out every last drop on their current setup as well. I wouldn't expect the PC landscape to change too much in the immediate future. Maybe in a few years we will see some large gains, and if Nvidia picks up the pace, maybe it will pay off. Here's to hoping.

Last edited by the_sextein : Jul 14, 2019 at 02:04 PM.
#52 - Radeon Arctic Islands
I think I'm set for a while. I love this 9900K; it's chewing through a ton of stuff. I've noticed a lot of improvements in games, so much so that I can't stop playing them long enough to report back here. Once I got this stable at 5.0GHz on all cores, I've just been burning through titles. Sure beats the 4770K that I had.
#53 - Master Troll
Quote:
__________________
By your retarded posts i can only conclude you haven't been spanked enough. yoz |
#54 - Raiding Curio's Stable
Quote:

Extremely lazy "OC", but the system is stupid fast.
#55 - RIP Roxen
5GHz all-core on the 9900K is easy; after that is where it gets difficult. It's a cool-running chip at 5GHz. Once the voltage starts kicking up, that's when the heat gets overwhelming.
#56 - Radeon Arctic Islands
Quote:

The heat really isn't that bad with an AIO hooked up to it. At stock I was running in the upper 50C range on stress tests.
#57 - Hallowed are the Ori
Thinking of this setup lasting me for the next 5 years; then I'll replace my 1660 Ti next year with the top-end Nvidia card, as I want to wait for ray tracing to mature and for the next-gen cards to have more hardware for it. I play at 2560x1080 144Hz, so I'm pretty sure I'm going to be CPU-limited with my 4770K at 4.3GHz. Or maybe I'll just keep what I have now and do the entire PC in one go... I notice stuttering in various games like Assassin's Creed, as my CPU is basically at 100% on all cores.
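(Aside: for context on why high refresh rates are so CPU-hungry, the per-frame time budget shrinks quickly as the target goes up, so the CPU gets far less time per frame at 144Hz than at 60Hz. A quick illustration:)

```python
# Frame-time budget at common refresh rates: everything the CPU does per frame
# (game logic, draw calls, driver work) has to fit inside this window to hold the target.
for hz in (60, 100, 144):
    print(f"{hz} Hz -> {1000.0 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms, 144 Hz -> 6.9 ms
```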
__________________
My Twitch Channel Unbiased Gaming! PS4/PC Streaming - Streaming PS Indie Titles + Infamous, Metal Gear, and Killzone Shadow Fall Fantards the scourge of the universe: |
#58 - RIP Roxen
I'd wait and see what Intel puts out, especially if you're rocking a 1660 Ti.
#59 - Hallowed are the Ori
__________________
My Twitch Channel Unbiased Gaming! PS4/PC Streaming - Streaming PS Indie Titles + Infamous, Metal Gear, and Killzone Shadow Fall Fantards the scourge of the universe: |
#60 - Raiding Curio's Stable
Probably mid-2020, with midrange cards. They eventually plan to release higher-end cards, but I'm guessing those are going to be 2021. Rumors are that Big Navi is coming out sooner than that, which may lower pricing a bit on the Nvidia side as well.