Radeon 128MB: fact or fiction?!?!

Nvidia announced that the GeForce will come in 64MB and 128MB versions, and I think many will buy the 128MB version because some people want the best and it's cool to have such a powerful card. Why shouldn't the Radeon 2 have 128MB? I think it will. So what if it costs a lot; wait 4 months and the price will be half as much. I bought my Radeon 64MB VIVO for 400 dollars and 3 months later it
costs 200 dollars. Soon the games will catch up to the graphics technology.
 
The 3dfx Voodoo 6000 wasn't really a 128MB chip, it was
4 chips with 32MB each. Correct me if I'm wrong, but I think
I read it somewhere.
And the Voodoo5 5500 is 2 chips with 32MB each = 64MB.
 
Yep. It's the same thing with the ATI Rage Fury MAXX. It's a 64 MB adapter consisting of 2 ATI Rage 128 Pro chips, sharing 32 MB of memory for each chip.
 
It isn't size, it's bandwidth that's important.
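The size-vs-bandwidth point can be sketched with some back-of-envelope math. This is a rough illustration, not exact specs: the clock and bus figures below are assumed, roughly GeForce3-class numbers.

```python
# Peak memory bandwidth depends on memory clock, transfers per clock
# (2 for DDR), and bus width -- not on how many MB are on the board.
# The 230 MHz / 128-bit figures below are assumptions for illustration.

def memory_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int,
                          ddr: bool = True) -> float:
    """Peak bandwidth = clock * transfers-per-clock * bus width in bytes."""
    transfers = 2 if ddr else 1
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * transfers * bytes_per_transfer / 1e9

# A hypothetical 128 MB card with the same 230 MHz DDR memory on the
# same 128-bit bus has exactly the same peak bandwidth as the 64 MB card:
print(memory_bandwidth_gb_s(230, 128))  # ~7.36 GB/s either way
```

Doubling the RAM changes neither term in that product, which is why capacity alone doesn't make the card faster.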

The extra 64 MB won't be used anyway.

Just because it has impressive specs doesn't mean it will perform that well. Nvidia is really good at misleading people like that.

Even now most games don't use the extra 32 MB of RAM. It's only useful if you're running at high resolutions like 1600x1200@85Hz, which incidentally is what I'm running on my 21" Sony CPD-G500.
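The resolution point can be made concrete with a quick framebuffer calculation. This is a sketch under common assumptions of the era (32-bit color, double buffering, a 32-bit depth buffer); texture memory comes on top of this.

```python
# Back-of-envelope: VRAM consumed by the framebuffer alone.
# Assumes front + back + depth buffers, each 32 bits per pixel.

def framebuffer_mb(width: int, height: int,
                   bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    # buffers = front + back + depth
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(framebuffer_mb(1600, 1200))   # ~22.0 MB at 1600x1200
print(framebuffer_mb(1024, 768))    # 9.0 MB at 1024x768
```

So the jump from 1024x768 to 1600x1200 alone eats roughly an extra 13 MB before a single texture is loaded, which is why the bigger framebuffer only starts to matter at high resolutions.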

Games are typically 2 to 3 years behind the technology. Unless it's designed by Carmack.

No video card uses 128MB chips. They're typically 8MB or 16MB chips.

All of those multiple-graphics-chip boards you're referring to never "shared" RAM; each chip was mapped to its own set of 8MB RAM chips.
 
Be ReaL!

A video card with an upgradeable GPU and VRAM was my idea a few years back. I asked myself the same question: why can't the companies make an upgradeable video board? A lot of reasons came to mind almost immediately, but one came to me about 6 months into my intensive research on the subject: COST. Cost, NOT TO THE CONSUMERS, but to the manufacturers:

Just think how much money the manufacturers would LOSE if they built upgradeable video boards. Even if they priced a card at $1000, they still wouldn't make as much money as they would selling 200-600 dollar cards each generation.

Just think about it:
- the average computer user will buy 3-5 video cards in his/her computing life; that's about $2000
- a gamer will buy about 5-9 video cards in his/her computing life; that's about $3600
- a graphics professional will buy 4-7 cards in his/her computing life (priced $800-2000); that's about $11000(!)
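As a sanity check, those totals can be reproduced by taking the midpoint of each range. The per-card prices here are assumptions chosen to match the stated totals, not figures from the post.

```python
# Reproduce the lifetime-spend estimates above using the midpoint
# of each cards-bought range. Average prices are assumed values.

def lifetime_spend(cards_low: int, cards_high: int, avg_price: float) -> float:
    mid_cards = (cards_low + cards_high) / 2
    return mid_cards * avg_price

print(lifetime_spend(3, 5, 500))     # average user: 4 cards * $500  = $2000
print(lifetime_spend(5, 9, 515))     # gamer: 7 cards * $515 = $3605, ~$3600
print(lifetime_spend(4, 7, 2000))    # professional: 5.5 * $2000 = $11000
```

Note the totals only work out if the average user pays around $500 per card, which is the assumption a later reply in this thread takes issue with.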


Technology is very resilient; it's the manufacturers' pockets that aren't :confused:
 
A card is much more than just the GPU and the RAM chips. There are capacitors and resistors and a whole load of other crap in there.

If you visually compare a card from even two years ago to something right now, you will see major architectural changes. Cards just use a LOT more power than before, plus new chips require new traces. And don't forget that faster and different RAM requires new timing circuitry.

Not to mention that discrete RAM would be a lot slower too.

BTW, most graphics cards sold cost under $100. It's only gamers and power users that pay $200-$1000 for a board.

And your math is really screwed up too. The average computer user does NOT spend $400 on a graphics card.
 