Page 16 of 773
Werelds
Special Little Man
Posts: 15098
Location: 0100111001001100
Posted: Thu, 24th Nov 2011 10:03
Slizza wrote: | I'm sure if AMD could have strapped more onto their GPU for a good increase in performance and still managed to fab them with good yields, they would have done it in a split second. No-brainer.
All I see is speculation to try to ease AMD's failure with the HD 6970's disappointing performance.
Red-tinted glasses. |
Not a no-brainer, because of their strategy - but I guess that concept is something you just can't grasp. For someone who claims to have worked (or to work?) in the semiconductor industry, you happily ignore a whole bunch of concepts which Intel, for example, follows as well. The decision to stick with smaller dies is of course based on yields; it comes down to the general principle that a larger die on the same wafer carries a higher risk of defects. 3.5 years ago AMD decided to stop doing the big dies (which they had been doing just like NVIDIA); that was a calculated decision, and letters and slides went out to partners, reviewers and just about everyone who wanted more details. R700 was under 300 mm^2, Evergreen just a bit above - ~300 mm^2 is their target now, and it's that strategy that made R700 and Evergreen the successes they were. Cayman would've been much smaller if they hadn't been fucked by TSMC - again, Cayman is not the chip it was supposed to be, and ignoring that fact and/or using it against AMD like you do is just plain ignorance.
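(As a rough illustration of the die-size/yield principle: a minimal sketch using the classic Poisson yield model with a made-up defect density - not TSMC's actual numbers.)
Code: |
import math

# Toy Poisson yield model: P(die has zero defects) = exp(-area * defect_density).
# Real fabs use more elaborate models, but the trend is the same:
# yield falls off exponentially as the die grows.
def poisson_yield(die_area_mm2, defects_per_mm2):
    return math.exp(-die_area_mm2 * defects_per_mm2)

D = 0.003  # hypothetical defect density (defects per mm^2), for illustration only

for area in (300, 400, 530):  # ~sweet-spot die vs. big-die territory
    print(f"{area} mm^2: ~{poisson_yield(area, D):.0%} of dies defect-free")
# At this made-up density: ~41% at 300 mm^2, ~30% at 400 mm^2, ~20% at 530 mm^2
|
At the same defect density, a bigger die doesn't just cost linearly more wafer area - a noticeably larger fraction of the dies end up in the bin.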
Yes, the 6970 was a disappointment; I never said it wasn't. I however don't ignore the differences between the two companies, and I can actually look at a chip from a technical point of view. Compared to Evergreen, and keeping the 32nm problem in mind, it's still a good step forward. Also don't forget that even a year ago, TSMC was still having problems producing 40nm in the quantities NVIDIA and AMD needed (as proven by the availability of their products).
It's okay though, I know you either won't read the above or will pick 3 keywords to "make your point". I'm done with you again; brick walls are easier to reason with.
@ HAWX2: I know you were trolling Leo, but have you ever seen the wireframes for it? Looks surprisingly similar to Crapsis 2, with the added bonus of 64 samples forced
That's not why it's such a piece of shit though. It's because a single 580 gets the same average framerate as two 580s in SLI; because a 6970 has the exact same framerate as a 6870 or 5870; because the entire 400 series from NVIDIA is DESTROYED by anything in the 500 series. Absolutely a good benchmark, yes - no artificial framerate limits in place at all
Posted: Thu, 24th Nov 2011 10:32
What plays in AMD's favor the most is that you don't even need their fastest card to enjoy games at max settings, and the price and power usage relative to the performance make them very well priced - which can't be said of, for example, the GTX 580, even if it's the faster card overall.
sausje
Banned
Posts: 17716
Location: Limboland, Netherlands
Posted: Thu, 24th Nov 2011 10:48
Breezer_ wrote: | Stop bashing the GTX 480, mine never hit 90°C even with the stock cooler. Ofc if you put it in some fucking generic case from the 90s it will heat up. The GTX 480 must be the most hated card; everyone's just making shit of it because they didn't try it themselves (it's still a very good card, and a bargain these days). |
My 470 got to 110 degrees, and a friend of mine's 480 gets above 90 degrees if he doesn't set the fan manually to 100% when gaming. Otherwise it's around 75-80.
Proud member of Frustrated Association of International Losers Failing Against the Gifted and Superior (F.A.I.L.F.A.G.S)
Slizza
Posts: 2345
Location: Bulgaria
Posted: Thu, 24th Nov 2011 18:58
Werelds wrote: | <snip - quoted in full above> |
Strategy is the new excuse for failure, I see.
Might work for you, but I'm not buying it.
If they could have made their chips better and pulled off a larger design, they would have, simple as that.
And bigger does not automatically mean more expensive.
And now AMD can't be criticized at all, because it's not how it was meant to be.
Yet you can waffle on about how bad you think Fermi was, even though that was down to fabbing issues and not the intention... fanboy hypocrisy much?
Corsair 750D :: 750W DPS-G :: Asus x370 PRO :: R7 1800X :: 16gb DDR4 :: GTX 1070 :: 525gb SSD :: Coolermaster 240MM AIO
Posted: Thu, 24th Nov 2011 23:12
sausje wrote: | <snip - quoted above> |
At full load my GTX 470 never gets above 80°C with the fan at 80%.
Posted: Thu, 24th Nov 2011 23:16
The 400 series ran hot, but if you could find them at a discount they were still beasts. I think I got my 470 for like $250 not too long after it came out. It isn't much slower than a 560 Ti, so I've had no reason to upgrade.
Frant
King's Bounty
Posts: 24658
Location: Your Mom
Posted: Fri, 25th Nov 2011 16:48
Slizza wrote: | <snip - quoted in full above> |
Failure? In what way have AMD failed with their GPUs? And don't you dare pull the fanboy card; you're a fanboy parasite in these threads, since you do NOTHING but flame and post false claims with no facts, no truth and no objectivity. Besides, you apparently lack the intellectual capacity to actually come up with proper arguments. Hence you being the biggest dickslapping troll in these threads.
In other words, you talk out of your ass about things you don't actually seem to have much knowledge about - just fanboy opinions based on... fanboyism.
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!
"The sky was the color of a TV tuned to a dead station" - Neuromancer
Posted: Fri, 25th Nov 2011 17:08
Sort of wondering myself how someone could say the latest AMD/ATI cards have 'failed'. If they've failed, then Fermi was like the planet blowing up. As much as I like my GTX 470 and Nvidia's drivers, the ATI cards are pushing out the same level of performance while using less power and fewer transistors.
Slizza
Posts: 2345
Location: Bulgaria
Posted: Fri, 25th Nov 2011 22:34
The HD 6970 performs roughly the same as a GTX 570, give or take a couple of frames depending on the game. It costs just as much; you can get a GTX 570 here for a bit less than an HD 6970.
The GTX 570 does things better, giving you a better DX11 + 3D + PhysX experience. Plus, AMD users find themselves crying about driver problems way more often than Nvidia users.
That leaves you with no good reason to buy an HD 6970, unless your primary concern is saving an absolutely tiny amount of electricity nobody would notice... or being a blind fanboy for AMD.
AMD failed completely to produce something to rival the GTX 580, leaving Nvidia the only game in town for the top-end single GPU.
Corsair 750D :: 750W DPS-G :: Asus x370 PRO :: R7 1800X :: 16gb DDR4 :: GTX 1070 :: 525gb SSD :: Coolermaster 240MM AIO
Posted: Sat, 26th Nov 2011 00:27
There are always reasons. Don't know why one needs to pick a side on this - I go for the best card available that suits my needs and means, no matter the brand. Last time it was the 5870, because the 400 series was late and not worth the money in the end anyway, and this time it looks like it's gonna be a 7000 series card, unless things change. I have 2 monitors and an HDTV hooked up, so Nvidia is a no-go for now unless I want to buy another card.
Better 1080p 3D gaming support is pretty much the only somewhat interesting feature Nvidia has to offer for me.
Can't even remember the last game before Arkham City that had any PhysX implementation worth mentioning. Even so, it works fine with hybrid drivers if it's that important.
The driver issues talk might as well be ignored: up until the 5870 all my previous GPUs were Nvidia's, and I had pretty much the same experience on both sides when it came to drivers.
sausje
Banned
Posts: 17716
Location: Limboland, Netherlands
Posted: Sat, 26th Nov 2011 01:30
Mchart wrote: | <snip - quoted above> |
Log from back then: http://www.kzn-clan.nl/aida_logs_2010-11-20_00-42-13_stat.htm
Like I said in another topic:
sausje wrote: | So yeah, everyone has their own issues with things; for some it's nonstop issues with Nvidia and none with AMD, and vice versa. |
Proud member of Frustrated Association of International Losers Failing Against the Gifted and Superior (F.A.I.L.F.A.G.S)
tonizito
VIP Member
Posts: 51473
Location: Portugal, the shithole of Europe.
Posted: Sat, 26th Nov 2011 02:42
Slizza wrote: | You can get a GTX 570 here for a bit less than a HD6970. | Wasn't expecting it, but it's actually the same over here:
HD6970 - 324€
GTX570 - 305€
boundle (thoughts on cracking AITD) wrote: | i guess thouth if without a legit key the installation was rolling back we are all fucking then |
Frant
King's Bounty
Posts: 24658
Location: Your Mom
Posted: Sat, 26th Nov 2011 13:41
Strange, really - the 570 is better than the 6970.
Anyway, I bought the 6950 2GB, unlocked the shaders and run it faster than a stock 6970, and I paid a lot less (€250). The 6970 is sort of overpriced. I had a specific budget and it came down to the 6950 or the 570, and I couldn't scrape together enough for the 570 at that particular price bracket. So the next best thing at the lower price bracket was the 6950 (I didn't even know it was unlockable until after I had bought it).
Pizza wrote: | Leaving you with no good reason to buy a HD 6970 unless your primary concern is saving an absolutely tiny amount of electricity nobody would notice... or being a blind fanboy for AMD. |
Only a blind nVidia fanboy would say anything like that. What do you know of people's specific needs? Can your 570 run 6 screens out of the box?
Pizza wrote: | AMD failed completely to produce something to rival the GTX 580, leaving Nvidia the only game in town for the top-end single GPU. |
AMD never claimed they were going for the top single GPU; that hasn't been their strategy for years. So how can they fail at something they never intended to do in the first place? THIS is how you fanboys "reason". Besides, the 580 is a stupid purchase - overpriced, overheating, etc. The 570 is the obvious choice, since it can be clocked way above 580 performance.
And by your "reasoning", the best single card today is the Radeon HD 6990. To me it's totally pointless; I'd rather go CrossFire if I ever lost my mind.
You cannot and will not discuss something on neutral grounds. Your existence seems to be focused on childish and pathetic brand wars. You're like one of those dimwits who go to soccer games only to throw Bengal flares and fight the opposing team's supporters after the game is over, no matter what the score was.
Grow up and wean yourself off mental diapers.
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!
"The sky was the color of a TV tuned to a dead station" - Neuromancer
Posted: Sun, 27th Nov 2011 04:39
Rumours have surfaced that the next-gen ATI/AMD cards (7900 series, most likely) will be using Rambus XDR2 memory.
Quote: | We have previously reported on AMD's official blog response regarding XDR2 memory on the 7900.
Now we learn that Radeon HD 7900 series graphics cards may use either of two memory types: XDR2 or GDDR5.
Digging deeper, we found that AMD has been paying Rambus patent licensing fees since 2006, and uses patented Rambus technology in some of its memory and memory controllers.
Compared to traditional GDDR5, XDR2 memory offers double the bandwidth with 30% lower power consumption, which is very attractive to graphics chip and graphics card manufacturers.
In addition, we have learned that Rambus has developed a memory controller that supports both XDR2 and GDDR5, so AMD needs only one memory controller to let the high-end Southern Islands chip support both memory types. Of course, an XDR2 model of the Radeon HD 7900 would offer higher performance and cost more, while a GDDR5 model would perform somewhat lower and could be priced cheaper.
It is reported that AMD's rival NVIDIA also pays Rambus patent fees, but does not hold XDR2 memory controller patent licenses, and the two sides are still in litigation; therefore, NVIDIA is unlikely to use XDR2 memory in Kepler.
|
Source: http://news.mydrivers.com/1/209/209885.htm
If this is true, it is indeed good news, since XDR2 can transport DOUBLE the data that GDDR5 can.
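(Taking the quote's "double the bandwidth" claim at face value, the arithmetic is simple. A sketch assuming XDR2 simply doubles GDDR5's 4 transfers per clock per pin - my reading of the rumour, not a confirmed spec.)
Code: |
# Theoretical bandwidth = (bus width in bytes) * clock (Hz) * transfers per clock
bus_bytes = 256 // 8      # a 256-bit bus, as on the HD 6970
clock_hz = 1375 * 10**6   # HD 6970 memory clock

gddr5 = bus_bytes * clock_hz * 4 / 10**9  # 4 transfers/clock -> 176.0 GB/s
xdr2 = bus_bytes * clock_hz * 8 / 10**9   # claimed 2x -> 352.0 GB/s
print(f"GDDR5: {gddr5} GB/s, claimed XDR2: {xdr2} GB/s")
|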
LeoNatan
☢ NFOHump Despot ☢
Posts: 73326
Location: Ramat HaSharon, Israel 🇮🇱
Posted: Sun, 27th Nov 2011 04:54
Frant wrote: | @ Slizza: At least I don't use my graphics card to have buttsecks with.
No, but seriously, it's useless to try to discuss anything with a brand-slave as brainwashed as yourself. God forbid you look past logos and brand names and focus on price/performance. I guess that makes you not-so-intelligent.
Get educated, grow a brain and stop trolling. |
But, it's the way it's meant to be played... 
sausje
Banned
Posts: 17716
Location: Limboland, Netherlands
Posted: Sun, 27th Nov 2011 11:06
DX 11.1?
Just as useless as ATI's 10.1?
Proud member of Frustrated Association of International Losers Failing Against the Gifted and Superior (F.A.I.L.F.A.G.S)
Posted: Sun, 27th Nov 2011 11:22
Directx 11.1: http://msdn.microsoft.com/en-us/library/windows/desktop/hh404562(v=vs.85).aspx
"In 2012, NVIDIA is expected to embrace AMD's sweet-spot strategy, with no massive die. Instead, the top single GPU part will be GK104, featuring 384-bit 1.5 GB GDDR5 memory. GK104 is said to push out 2 TFLOPS, 30% higher than GTX 580. However, despite the smaller die, 4Gamer claims it consumes over 250W power. GK104 will release bang in the middle of 2012, perhaps during Computex time. Following right after GK104 will be GK110 - a dual GK104 flagship, thus completing NVIDIA's line-up for most of 2012 - remarkably similar to AMD's sweet spot strategy. "
Only 1.5 GB of VRAM? Wtf.
http://vr-zone.com/articles/report-nvidia-28nm-desktop-gpu-roadmap/14067.html
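(A quick sanity check on that "30% higher than GTX 580" figure, using the 580's known shader count and hot clock; the 2 TFLOPS number is the article's claim, not mine.)
Code: |
# Theoretical single-precision throughput = cores * shader clock * 2 (an FMA counts as 2 FLOPs)
gtx580_tflops = 512 * 1544e6 * 2 / 1e12  # GTX 580: 512 CUDA cores @ 1544 MHz -> ~1.58 TFLOPS
gk104_claimed_tflops = 2.0               # per the 4Gamer/VR-Zone rumour

uplift = gk104_claimed_tflops / gtx580_tflops - 1
print(f"GTX 580: {gtx580_tflops:.2f} TFLOPS, claimed GK104 uplift: {uplift:.0%}")  # ~27%
|
So the quoted 30% roughly checks out, at least on paper.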
Werelds
Special Little Man
Posts: 15098
Location: 0100111001001100
Posted: Sun, 27th Nov 2011 12:19
Don't read too much into that roadmap and the stuff posted on 4Gamer. The chip names don't match NVIDIA's naming scheme, which they've used from G80 onwards (before that it was a mess). No GK100, and a dual-GPU solution with a "real" chip name? Perhaps they meant double the size of GK104, but it is very unlikely NVIDIA would give a dual-GPU solution a name meant for a single chip. Even a doubled GK104 would not make sense, however, as traditionally there are more differences between the top models (G80, GT200, GF100, GF110) and their smaller versions (G82/84/86, GT 214/215/216, GF104/106/108... you get the idea). Not saying it won't happen; it just doesn't fit anything they've done over the past 5 years (the glorious G80 came out in November 2006). Them going for AMD's strategy sounds plausible, but we'll see; GF104 (GTX 460) was a massive hit, as is GF114 (GTX 560 Ti).
@ wizarD: also quite unlikely; that rumour has been floating around since July. There's not much point to XDR2 from a bandwidth point of view: GDDR5 still has some headroom left, and even that isn't being saturated at the moment. XDR2 does have some other benefits (which I won't bore you with here), so again, don't read too much into it.
We simply don't know enough about AMD's new architecture yet (nor will we until right before launch, probably), and NVIDIA's stuff is still too far away to get any details, so all of the above comes from people making guesses based on very small snippets of information.
Werelds
Special Little Man
Posts: 15098
Location: 0100111001001100
Posted: Mon, 12th Dec 2011 14:03
Also completely off, and it shows the lack of knowledge on their part
Quote: | 384-bit wide GDDR5 memory interface, memory clock slightly below 1 GHz, target bandwidth of 240~264 GB/s |
((Bus width / 8) × (clock in Hz) × (memory type multiplier)) / 1000^3 = theoretical bandwidth in GB/s
The multiplier (MP) for GDDR5 is 4; rearrange that formula and you can figure out that 240 GB/s already requires a 1250 MHz clock, not exactly "just under 1 GHz". AMD currently have it running at 1375 MHz (6970 speeds) for a bandwidth of 264 GB/s (which is where that top-end "guess" comes from). 1.5 GHz might just be doable according to someone I know at MSI; that would result in 288 GB/s.
Verify your shit please, TPU, or at least don't blindly copy/paste shit someone emails to you. As for the other bits: the die size might very well be accurate - it's bigger than Cypress but still relatively small. The "2048 1D cores" claim is bullshit; no architecture details are out yet, but AMD said at their events that you can't really look at these GPUs the same way anymore (which has made me *really* curious ).
And for the math-impaired, here's the formula with the proper numbers:
#1: 384 / 8 = 48 (bytes, for the record)
#2: 1250 MHz = 1,250,000,000 Hz (M = mega = 1 million, so 1250 million Hz)
#3: GDDR5 MP = 4
Result: 48 × 1,250,000,000 × 4 = 240,000,000,000 bytes/second; divide by 1000^3 (1 billion) to get gigabytes (note I'm going by giga = 1000^3, not gibi = 1024^3, which is hardly ever used here) and presto: 240 GB/s.
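(Werelds' formula above, wrapped up so anyone can plug in their own numbers - a direct transcription of the post, nothing assumed beyond the GDDR5 multiplier of 4 it already gives.)
Code: |
def theoretical_bandwidth_gbs(bus_width_bits, clock_mhz, multiplier=4):
    # ((bus width / 8) * clock in Hz * memory multiplier) / 1000^3, in decimal GB/s
    return (bus_width_bits / 8) * (clock_mhz * 10**6) * multiplier / 1000**3

print(theoretical_bandwidth_gbs(384, 1250))  # 240.0 GB/s -- TPU's target needs 1250 MHz
print(theoretical_bandwidth_gbs(384, 1375))  # 264.0 GB/s -- at 6970 memory speeds
print(theoretical_bandwidth_gbs(384, 1500))  # 288.0 GB/s -- the rumoured 1.5 GHz
|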
Sin317
Banned
Posts: 24321
Location: Geneva
Posted: Mon, 12th Dec 2011 15:16
Why would you need more than 1.5 GB of VRAM?
sausje
Banned
Posts: 17716
Location: Limboland, Netherlands
Posted: Mon, 12th Dec 2011 16:22
Modded Skyrim^^
Proud member of Frustrated Association of International Losers Failing Against the Gifted and Superior (F.A.I.L.F.A.G.S)