
AMD 4th Generation APU Reviews ("Kaveri")


  • metroidfox
    replied
    Originally posted by The Luggage View Post
    Except for the fact that the 28nm process GF and TSMC use is closer to Intel's 32nm than to their 22nm. As for the next node, you can't compare a tapeout with actual production. If you want to go that route, Intel taped out 14nm two years ago.

    http://www.fudzilla.com/home/item/28...ell-chip-works

    Intel's competitors aren't even producing 20nm wafers yet (roughly equivalent to Intel's 22nm process). So yeah, Intel is years ahead.

    To be fair, IBM is ahead of Intel in process capability, even though most ordinary people never get to use its processors. IBM was working to help ARM get down to 14 nm back in 2011, a full year before Intel taped out 14 nm.



  • The Luggage
    replied
    Originally posted by Sound_Card View Post
    They are not "years" behind Intel. That's a myth that keeps getting perpetuated over and over. TSMC is taping out 16nm designs right now, and GF is on track for 14nm production by the end of this year. When does Broadwell come out? Oh, that's right: supposedly the end of this year.

    Did you already forget that TSMC got its 40nm and 28nm to market BEFORE Intel's 32nm and 22nm chips? Intel's lead is shrinking every year. A lead nonetheless, however.
    Except for the fact that the 28nm process GF and TSMC use is closer to Intel's 32nm than to their 22nm. As for the next node, you can't compare a tapeout with actual production. If you want to go that route, Intel taped out 14nm two years ago.

    http://www.fudzilla.com/home/item/28...ell-chip-works

    Intel's competitors aren't even producing 20nm wafers yet (roughly equivalent to Intel's 22nm process). So yeah, Intel is years ahead.



  • Sound_Card
    replied
    Originally posted by The Luggage View Post
    Huh? 22nm was already a generation ahead of the competition, and Intel is already manufacturing wafers on 14nm. 10nm development is on schedule, and that process is actually looking more robust than the 14nm process. The other companies are years behind Intel.
    They are not "years" behind Intel. That's a myth that keeps getting perpetuated over and over. TSMC is taping out 16nm designs right now, and GF is on track for 14nm production by the end of this year. When does Broadwell come out? Oh, that's right: supposedly the end of this year.

    Did you already forget that TSMC got its 40nm and 28nm to market BEFORE Intel's 32nm and 22nm chips? Intel's lead is shrinking every year. A lead nonetheless, however.



  • The Luggage
    replied
    Originally posted by Sound_Card View Post
    I've seen a couple of hints that they're bowing out of the CPU market, though they deny it. I just don't see them ever catching up. We can't even blame the process lead anymore; Intel is not really that far ahead of GF or TSMC in process tech. All three are hitting the same wall, and I suspect they, along with IBM, will have to collaborate on what to do next.

    Intel simply has much better IPC, the process lead aside. My old Q9300 (45nm) is only ever so slightly slower than AMD's Piledriver. That's bad.
    Huh? 22nm was already a generation ahead of the competition, and Intel is already manufacturing wafers on 14nm. 10nm development is on schedule, and that process is actually looking more robust than the 14nm process. The other companies are years behind Intel.



  • ASCI Blue
    replied
    I would argue against that. Get the APU now and wait on a discrete card. In theory, for the more mainstream user, it should double the life of the system: you save the cost of a GPU up front and drop in something from the generation after later on.

    As of now my 7850 APU handles everything I've thrown at it playably, and just peachy. It probably can't do Crysis or Far Cry at 1080p with everything maxed, but it does lovely with Diablo 3 and Titanfall at 1080p on medium-to-max settings.



  • noko
    replied
    Originally posted by Redeemed View Post
    All of this seems so familiar: the doom and gloom reported for AMD.

    Honestly, I think they're taking the silent approach for a reason. I suspect AMD is likely going to bow out of the CPU business and focus on APUs and GPUs only. We haven't had a new CPU or even a refresh from AMD in forever, and there's no news of anything new in the pipeline. However, they seem to be making plenty of noise about their APUs. Not only that, but their GPUs are doing well. The 290 series was a bit underwhelming with how loud and hot the cards run, and CrossFire support seems lacking from what I've been reading.

    For single-card setups, however, they seem all right.

    I don't think AMD has been sitting idly on their haunches. I'm pretty sure they've got a solid next-gen GPU up their sleeve to compete with Maxwell, likely designed for 20nm. Now, might they scale it back some and release a 28nm version? Possibly. Maybe they're waiting to see if nVidia will release a 28nm Maxwell.

    For CPUs, I think AMD is done; I'm pretty sure they're giving up on that front and focusing on APUs. This is probably their best bet, imo.

    It also seems that instead of funneling cash into R&D for future products, they're spending a lot of resources on developer relations, which I don't think is bad. Get in with some of the big developers, then get back to pumping money into R&D. It may prove to be a sound strategy.

    It may. I wonder how long it'll take till we find out.
    I will be interested in anything they release, but it has to be released first. My needs in a laptop are more basic, but the next one will have to have a stellar screen. I have time and am in no hurry.

    Since GPUs are advancing faster than CPUs, APUs have a handicap: the GPU portion gets outdated fast, at which point keeping just the CPU useful becomes important. In my Llano's case the GPU portion is pointless now, and the CPU is feeding an Nvidia 750 Ti, where it falls short, in a nutshell.



  • Sound_Card
    replied
    I've seen a couple of hints that they're bowing out of the CPU market, though they deny it. I just don't see them ever catching up. We can't even blame the process lead anymore; Intel is not really that far ahead of GF or TSMC in process tech. All three are hitting the same wall, and I suspect they, along with IBM, will have to collaborate on what to do next.

    Intel simply has much better IPC, the process lead aside. My old Q9300 (45nm) is only ever so slightly slower than AMD's Piledriver. That's bad.



  • Redeemed
    replied
    All of this seems so familiar: the doom and gloom reported for AMD.

    Honestly, I think they're taking the silent approach for a reason. I suspect AMD is likely going to bow out of the CPU business and focus on APUs and GPUs only. We haven't had a new CPU or even a refresh from AMD in forever, and there's no news of anything new in the pipeline. However, they seem to be making plenty of noise about their APUs. Not only that, but their GPUs are doing well. The 290 series was a bit underwhelming with how loud and hot the cards run, and CrossFire support seems lacking from what I've been reading.

    For single-card setups, however, they seem all right.

    I don't think AMD has been sitting idly on their haunches. I'm pretty sure they've got a solid next-gen GPU up their sleeve to compete with Maxwell, likely designed for 20nm. Now, might they scale it back some and release a 28nm version? Possibly. Maybe they're waiting to see if nVidia will release a 28nm Maxwell.

    For CPUs, I think AMD is done; I'm pretty sure they're giving up on that front and focusing on APUs. This is probably their best bet, imo.

    It also seems that instead of funneling cash into R&D for future products, they're spending a lot of resources on developer relations, which I don't think is bad. Get in with some of the big developers, then get back to pumping money into R&D. It may prove to be a sound strategy.

    It may. I wonder how long it'll take till we find out.



  • noko
    replied
    Nvidia's Maxwell looks like it will dominate the notebook discrete arena due to performance per watt; I don't see AMD having anything there. APU-wise, Intel has the lower power draw, so unless AMD pulls a rabbit out of the hat, Intel's domination will just continue. Can AMD do an APU-only gaming laptop? Almost by definition, I don't think an AMD APU would be powerful enough to qualify as a gaming laptop. And for ultrabooks, where performance, power, and battery life are king, where does AMD fit?

    Kaveri is just too late to the game, with laptops nowhere to be found and the current crop of Kaveri APUs nothing spectacular and overpriced for the performance. If AMD had some low-power six- or eight-core versions with some CPU muscle, plus DDR4 RAM support and motherboards to match, that would be very interesting. I doubt that will happen this year.



  • Sound_Card
    replied
    The CPU is severely holding back the GPU. Iris Pro is right there with it because of Haswell's CPU IPC and eDRAM. When Broadwell comes this year, I'm not sure how AMD is going to stay viable in the notebook market (and, to a good extent, Nvidia). Its graphics are going to be nearly 50% faster than Iris Pro. Already it is very difficult to find notebooks equipped with Radeons. I suspect Nvidia is practically giving away its mobile GPUs, and Radeons have no chance of being paired with Intel CPUs when Intel's GPU is nearly as fast, or fast enough.



  • noko
    replied
    Originally posted by ASCI Blue View Post
    Not sure this would work, unless it would work with two discrete GPUs to begin with. In theory, doesn't everything get shuffled to a single output, with the second GPU acting as additional power for the first?

    The BIOS settings ask which is the primary, where you can choose onboard or PCI.
    It should work, with the game on the primary monitor using the discrete GPU. Still, I just can't justify an AMD APU for my setup due to price, performance, and power. For a laptop with a very nice display an AMD APU would be very interesting. If there were an mITX AM3+ motherboard I could use an FX-8350 (downclocked and undervolted), which would be interesting and still powerful.



  • ASCI Blue
    replied
    Originally posted by noko View Post
    I could have two monitors, one driven by the APU and the other by the discrete card; I wonder how that would work out? Double the fun or double the trouble?
    Not sure this would work, unless it would work with two discrete GPUs to begin with. In theory, doesn't everything get shuffled to a single output, with the second GPU acting as additional power for the first?

    The BIOS settings ask which is the primary, where you can choose onboard or PCI.



  • noko
    replied
    That is good to know: get the hardware working on other things besides just graphics, otherwise it's just a waste of space.



  • caveman-jim
    replied
    No, TrueAudio will still work on an APU with any discrete GPU installed, as long as the APU's GPU is still enabled (some BIOSes may disable the APU GPU when they detect a dGPU installed).



  • noko
    replied
    Originally posted by caveman-jim View Post
    Kaveri notebooks haven't been announced yet, hence you haven't heard about availability. Telling you when you'll hear about it isn't a good strategy, it seems to me... it's coming soon (tm)
    Well, I have time; no hurry yet. June is when I'll get more serious about a new, more powerful laptop. I am still using my Brazos 350 and like it, except the screen is rather ugly.

    How about TrueAudio with an Nvidia discrete graphics card, as in Kaveri plus a Maxwell GPU? Is AMD locking out use of TrueAudio with other folks' graphics cards?



  • caveman-jim
    replied
    Kaveri notebooks haven't been announced yet, hence you haven't heard about availability. Telling you when you'll hear about it isn't a good strategy, it seems to me... it's coming soon (tm)



  • noko
    replied
    Originally posted by caveman-jim View Post
    Dual Graphics is the first of the upgrade options for an existing APU platform, not the ultimate one. You could substitute an FX-6300 or Athlon X4 760K and get the same performance and value. That article was set up to create a headline, not to do a reasonable comparison - if you take the logic to its ultimate conclusion, you should buy a last-gen used platform from a verified seller on eBay instead of a new CPU/mobo/GPU etc., just like used cars are better value than new cars.

    Adding an R7 250 to a Kaveri will give you better performance than adding an R7 250 to an Intel CPU of the same price, or than adding a GeForce to either the APU or the Intel CPU.

    At any given price point for a new product, you can break that cost down into a different set of components that would be better for one particular task; this has always been true.
    The article was about cheap gaming or budget gamers - not important. Yes, I understand AMD's stance on APUs; Dual Graphics gives an option for a viable upgrade.

    I am still debating whether to go with an APU laptop (no discrete GPU); it would need good graphics, though not for gaming (maybe some older games that play well), and a powerful enough CPU. For business, 3D work (the modeling portion), and photo editing I think Kaveri will do very well.

    Or a faster CPU, as in an i5 or better, with a Maxwell discrete GPU: the Intel GPU when on battery for longer battery life (it works well with Intel stuff) and the Maxwell for all of the above plus gaming. This would be overkill but fun nonetheless.

    There is just no news, not even a recent hint, on the availability of Kaveri laptops, performance, etc.

    I will want a minimum of 1080p and preferably IPS, 15"-16". A touchscreen would be OK but not mandatory. 1440p would be nicer.

    I also found the availability of mITX motherboards for the FX line utterly lacking; getting an AMD CPU for a GF 750 Ti in that form factor leaves APUs as the only option, in which case basically over half the chip will not be used. Can TrueAudio be used with an Nvidia card doing graphics? What other programs can use the Kaveri chip's GPU while an Nvidia card is installed?
    I could have two monitors, one driven by the APU and the other by the discrete card; I wonder how that would work out? Double the fun or double the trouble?



  • caveman-jim
    replied
    Dual Graphics is the first of the upgrade options for an existing APU platform, not the ultimate one. You could substitute an FX-6300 or Athlon X4 760K and get the same performance and value. That article was set up to create a headline, not to do a reasonable comparison - if you take the logic to its ultimate conclusion, you should buy a last-gen used platform from a verified seller on eBay instead of a new CPU/mobo/GPU etc., just like used cars are better value than new cars.

    Adding an R7 250 to a Kaveri will give you better performance than adding an R7 250 to an Intel CPU of the same price, or than adding a GeForce to either the APU or the Intel CPU.

    At any given price point for a new product, you can break that cost down into a different set of components that would be better for one particular task; this has always been true.



  • noko
    replied
    A Dual Graphics review pitting an i3 against Kaveri using Dual Graphics. Performance per buck is just not in AMD's corner. I agree with the article: the prices of Kaveri APUs are just too high, and the corresponding GCN cards that can play along with them, a.k.a. the R7 250, are too expensive. The i3 and R7 260X blow away anything the AMD APU can dish out, and at a cheaper price.

    http://www.extremetech.com/gaming/17...-budget-gaming

    Carrying that over to laptops (my future interest), against an Intel CPU combined with a Maxwell GPU I am not sure how AMD can compete. Why are Kaveris so expensive?



  • DiaperJe|\|i3
    replied
    Very cool. It's amazing the difference it makes.



  • noko
    replied
    Any word on when the laptops with Kaveri are going to hit, plus the lower-end desktop versions, particularly the 45W ones?

    The best Dual Graphics match-ups for the different SKUs as well.



  • oozz77
    replied
    Hi Revan,

    I took a look at the game site and it truly is excellent and would be another prime candidate for a Mantle patch.

    A few friends also saw the video clip and agreed that it looks like a great game in the making, especially being first person!!

    I hope they get the financing needed, and I do hope someone from AMD has a look as well!



  • DiaperJe|\|i3
    replied
    I picked up an A8-4500M last year for under $500. My primary criteria were a 14" screen and a quad core. I've been pleasantly surprised and impressed by the purchase. I've been playing through Dragon Age: Origins and Dead Space with settings all on high (1366x768). I was even able to play a bit of Battlefield 3 with settings on low. Not a silky-smooth 60fps frame rate, but quite playable. I'm very anxious to see what Kaveri can deliver to the mobile masses.



  • Crisler
    replied
    Originally posted by dampflokfreund View Post
    AMD should get as many developers as it can onto Mantle.
    Not going to disagree, but they need to do some of it themselves. AMD traditionally has not put skin in the game. They develop cool technology, then hand it to developers and wait for them to do something with it; this approach has just never worked.

    When it is pointed out to them that they need to be aggressive like Nvidia with these technologies, the reason given is lack of money. Okay, that's fine: stop funding overclocking competitions that set records while the chip is still slower than the competition, and use that money for some direct development.

    AMD NEEDS to get directly involved and quit waiting for others to do the work. THEN this technology will take off.



  • dampflokfreund
    replied
    Originally posted by Crisler View Post
    I think the biggest issue facing AMD right now with Kaveri is the ability to deliver on the vision.

    Right now, with none of the future tech enabled, Kaveri is a decent chip. The A10s are overpriced, but the A8 is a solid choice; even so, it is only decent.

    The true power of Kaveri comes when the future tech kicks in, and this is where AMD has to step it up. If we do not see Mantle or TrueAudio or even hUMA in the market soon, and in a bigger way than one or two titles, then the future of Kaveri might not be so bright.

    Jim, we all know the AMD response that this is now in the hands of developers, but that is not an approach AMD can risk. Recall the days when AMD announced it was working with Bullet. I sat in on a call where we were told by AMD execs that Bullet would soon take off and the vision AMD was pushing would become reality. Well, we all know the only thing Bullet shot was AMD in the foot; it is not, and is not on track to be, a widely used product.

    This passive approach by AMD is why it has lagged behind, in my opinion. True, the competition's various features are not mainstream either, but they actually made an impact in the market. AMD has got to get off the bench and stop relying on others to make its products shine; it needs to do it itself.

    Now, all that being said, I still stand behind my earlier statement: if you are buying a new computer for general use and light gaming it makes no sense to get an Intel; Kaveri is the clear choice.

    As for the future, we will see. AMD has a pitch floating a little low on the outside edge of the plate; will they slap it for a home run or bunt?
    AMD should get as many developers as it can onto Mantle.



  • bittermann
    replied
    Originally posted by noko View Post
    DDR3 or GDDR5?

    For Kaveri, will having the GDDR5 version make any significant difference? Will it work?

    Really, with this launch AMD just is not giving out enough info; the reviews do not go into this in any major way, plus there is the rather limited range of Kaveri SKUs. I am not sure system builders will be that interested, due to price. This is becoming a letdown.
    I have the GDDR5 version. Whether DDR3 or GDDR5 is better is a good question... So little information on this, it's scary...



  • noko
    replied
    Originally posted by bittermann View Post
    I think the best match for the top Kaveri is an R7 250. I have that card, as I recently purchased it with a Best Buy gift card I got for Christmas. Quite surprised at what it can handle. Overall it's about equivalent to a 7750, or slightly slower, on older games.
    DDR3 or GDDR5?

    For Kaveri, will having the GDDR5 version make any significant difference? Will it work?

    Really, with this launch AMD just is not giving out enough info; the reviews do not go into this in any major way, plus there is the rather limited range of Kaveri SKUs. I am not sure system builders will be that interested, due to price. This is becoming a letdown.



  • bittermann
    replied
    Originally posted by Crisler View Post
    The graphics in the top three APUs appear to be the same, with the only difference being a bit of clock speed at the top, so I am betting the 240/250 pairing is the same for all of them.
    I think the best match for the top Kaveri is an R7 250. I have that card, as I recently purchased it with a Best Buy gift card I got for Christmas. Quite surprised at what it can handle. Overall it's about equivalent to a 7750, or slightly slower, on older games.



  • Crisler
    replied
    Originally posted by oozz77 View Post
    Hi caveman-jim,

    I've been reading all I can about the new Kaveri APUs and came across one or two sites where they have already matched them up in Dual Graphics with a 7750 (both the DDR3 and GDDR5 versions)!

    Can you confirm whether there is any benefit to going Dual Graphics with the GDDR5 version?

    Also, from the reading I have done, AMD says the best match is the new 240 or 250 add-in cards, so can you confirm whether these cards will have TrueAudio just like the 260, and whether the top A10-7850 would work with the 260?
    The graphics in the top three APUs appear to be the same, with the only difference being a bit of clock speed at the top, so I am betting the 240/250 pairing is the same for all of them.



  • oozz77
    replied
    Hi caveman-jim,

    I've been reading all I can about the new Kaveri APUs and came across one or two sites where they have already matched them up in Dual Graphics with a 7750 (both the DDR3 and GDDR5 versions)!

    Can you confirm whether there is any benefit to going Dual Graphics with the GDDR5 version?

    Also, from the reading I have done, AMD says the best match is the new 240 or 250 add-in cards, so can you confirm whether these cards will have TrueAudio just like the 260, and whether the top A10-7850 would work with the 260?



  • bittermann
    replied
    Originally posted by Crisler View Post
    Now, all that being said, I still stand behind my earlier statement: if you are buying a new computer for general use and light gaming it makes no sense to get an Intel; Kaveri is the clear choice
    *sigh*

    AMD keeps pushing out future tech that needs future software to take advantage of it. The problem with that is there are no certainties here. When Mantle comes out, let's hope the game YOU want to play supports it. It's all a guessing game at this time. These new APUs would be great around the $100 mark, but not at their current prices. And by the time their prices do come down, there will be something newer coming out.



  • Crisler
    replied
    I think the biggest issue facing AMD right now with Kaveri is the ability to deliver on the vision.

    Right now, with none of the future tech enabled, Kaveri is a decent chip. The A10s are overpriced, but the A8 is a solid choice; even so, it is only decent.

    The true power of Kaveri comes when the future tech kicks in, and this is where AMD has to step it up. If we do not see Mantle or TrueAudio or even hUMA in the market soon, and in a bigger way than one or two titles, then the future of Kaveri might not be so bright.

    Jim, we all know the AMD response that this is now in the hands of developers, but that is not an approach AMD can risk. Recall the days when AMD announced it was working with Bullet. I sat in on a call where we were told by AMD execs that Bullet would soon take off and the vision AMD was pushing would become reality. Well, we all know the only thing Bullet shot was AMD in the foot; it is not, and is not on track to be, a widely used product.

    This passive approach by AMD is why it has lagged behind, in my opinion. True, the competition's various features are not mainstream either, but they actually made an impact in the market. AMD has got to get off the bench and stop relying on others to make its products shine; it needs to do it itself.

    Now, all that being said, I still stand behind my earlier statement: if you are buying a new computer for general use and light gaming it makes no sense to get an Intel; Kaveri is the clear choice.

    As for the future, we will see. AMD has a pitch floating a little low on the outside edge of the plate; will they slap it for a home run or bunt?



  • caveman-jim
    replied
    Originally posted by noko View Post
    It just seems AMD is moving slowly here - no laptop chips yet, only the A10 available . . .
    When have you ever seen a top-to-bottom launch across notebook and desktop on the same day?

    AMD launched the desktop parts first because it could; even if the notebook parts had been launched, you still would not have seen them until the OEMs refresh their lineups. When it's notebook refresh time, count on AMD showing up to the party.



  • caveman-jim
    replied
    Originally posted by The Luggage View Post
    Sorry, but for most of the people that buy cheap computers, the graphics do not matter. That's just how it is.
    How do you come to this conclusion? It is contrary to all industry analyst thinking and to design trends (if the GPU does not matter, why is Intel dedicating more and more die space to the GPU each generation?).



  • noko
    replied
    Don't forget about TrueAudio, which I hope goes beyond games into programs accessing the hardware for video playback. I can't wait to see Kaveri hit laptops too. It just seems AMD is moving slowly here - no laptop chips yet, only the A10 available . . .



  • Crisler
    replied
    Originally posted by The Luggage View Post
    Sorry, but for most of the people that buy cheap computers, the graphics do not matter. That's just how it is.
    Seriously, dude, you must not know many gamers then. The majority of gamers are not geeks or gearheads like us; they are everyday people who just want to play their games. The reason Intel integrated graphics and older graphics cards make up the vast majority of the Steam survey is that these people just want to play their games.

    Now offer those same people a choice of two cheap computers, OEM builds, where one has a TON better graphics performance in their games, and they will buy that system.

    Luggage, I understand it is easy for us geeks to be out of touch with the real world when it comes to computing.



  • Redeemed
    replied
    Originally posted by The Luggage View Post
    How about you properly quote me?
    No, I quoted you correctly. The GPU is used for far more today than just gaming, even by those doing the most "simplistic" of tasks, to the point that even an "almighty" low-end CPU from Intel on its own would not be able to replace a proper GPU. This is why I referenced them ramping up their IGP business so heavily. The top-end CPUs are not the only chips from Intel to sport an IGP; even their low-end offerings have them. Why, if the GPU doesn't matter?

    Everything from desktop composition to web pages to DVD playback to YouTube videos (the list goes on) benefits from the GPU you're using. And that will only grow stronger as time progresses. Intel sees this and is trying to adjust for it.

    The APU is a threat to Intel whether you want to admit it or not. And the x86 cores are not what Intel is afraid of...
    Last edited by Redeemed; Jan 19, 2014, 01:08 PM.



  • noko
    replied
    Now I reflect on the two eight-core versions of the AMD APU, the PS4 and the XBone, with the XBone having embedded RAM. I would think a variation of the XBone APU would be an ideal server chip when programmed to take full advantage of its super-fast internal memory. I think AMD is in a position to really turn things upside down with servers.

    I am very impressed with the 45W A8 but scratch my head at the A10s. I may pick up an A8 for my Commodore rig, which isn't really being used; that should do the trick in getting performance up while keeping heat down. The current 65W Trinity is still too hot, meaning I have to underclock it, which hurts performance too much.



  • noko
    replied
    Did anyone notice that when the full computing strength of the AMD APU and the Intel APU was tested, as in both CPU and GPU being used, AMD creamed Intel? When all the processing power of both chips was used, AMD was over 200% faster than an Intel i7-4770K!
    ComputeBench
    http://www.anandtech.com/show/7677/a...0-a10-7850k/14

    Does anyone really believe that developers on the new generation of consoles will not start programming to use all of the console APU's power?

    I think the question is not if but when HSA will really be used. I say it is being used now and its use will grow rapidly. In what area? Mostly graphical stuff, as in games, at first. Now consider the bench above, and then consider servers. Server workloads? Games being streamed in the future? Etc. The 45W A8-7600 trumps that i7-4770K when the whole processing power of the chip is used.
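
    For anyone wondering what "using both the CPU and the GPU" looks like from the software side, here is a minimal, hypothetical OpenCL sketch (not taken from the linked review): it simply enumerates the CPU and GPU devices that a heterogeneous benchmark could split its work across on an APU system. Platform layout and device names will vary with the installed driver.

    /* Minimal sketch (illustrative only): list the CPU and GPU compute devices
     * that an OpenCL workload could be split across on an APU system.
     * Assumes an OpenCL SDK/runtime is installed; build with e.g.:
     *   gcc list_cl_devices.c -lOpenCL
     */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;

        clGetPlatformIDs(8, platforms, &num_platforms);
        if (num_platforms > 8) num_platforms = 8;

        for (cl_uint p = 0; p < num_platforms; ++p) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;

            /* Ask each platform for both CPU and GPU devices. */
            if (clGetDeviceIDs(platforms[p],
                               CL_DEVICE_TYPE_CPU | CL_DEVICE_TYPE_GPU,
                               8, devices, &num_devices) != CL_SUCCESS)
                continue;
            if (num_devices > 8) num_devices = 8;

            for (cl_uint d = 0; d < num_devices; ++d) {
                char name[256] = "";
                cl_device_type type = 0;

                clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
                printf("%s device: %s\n",
                       (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU", name);
            }
        }
        return 0;
    }

    A benchmark of the combined-use kind discussed above would then create a context and command queue per device and divide the same workload between the CPU cores and the GPU, which is the basic idea behind the numbers being compared.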



  • The Luggage
    replied
    Originally posted by Redeemed View Post
    For most people graphics don't matter?

    Sooo... why is Cirrus Logic not the premier video card manufacturer? If graphics didn't matter I'd figure they'd be making a killing right about now.


    Also, I suppose if graphics don't matter and HSA is irrelevant, Intel must just love tossing money away, what with all their focus on integrated graphics and whatnot. Yeah... that's it.
    How about you properly quote me?
    for most of the people that buy cheap computers, the graphics do not matter

