Is there any reason to buy a G-Sync monitor now? I want to know because Nvidia supports FreeSync now.
Well supposedly with a G-Sync display you get:
LFC (Low Framerate Compensation) by default, which is not available on all FreeSync monitors (see the sketch below for how LFC works).
Less ghosting, plus whatever Nvidia's quality certification means (according to reviews, just better overall panel performance).
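For context, LFC is conceptually simple: when the frame rate drops below the panel's minimum variable refresh rate, the driver shows each frame an integer number of times so the effective refresh rate stays inside the supported range. That's also why it needs a maximum refresh of roughly twice the minimum, which many cheaper FreeSync panels with narrow ranges don't have. A rough illustrative sketch of the idea (plain Python, not driver code; the 48-144 Hz window is a made-up example panel):
[code]
# Illustrative sketch of Low Framerate Compensation (LFC) -- not actual driver code.
# Example VRR window for a hypothetical panel: 48-144 Hz.
VRR_MIN_HZ = 48.0
VRR_MAX_HZ = 144.0

def lfc_repeat_count(frame_rate_hz: float) -> int:
    """How many times each rendered frame is shown so the effective
    refresh rate lands inside the panel's VRR window."""
    if frame_rate_hz >= VRR_MIN_HZ:
        return 1  # already inside the window, no compensation needed
    repeats = 2
    while frame_rate_hz * repeats < VRR_MIN_HZ:
        repeats += 1
    return repeats  # stays below VRR_MAX_HZ because the window is wider than 2:1

print(lfc_repeat_count(30.0))  # 2 -> panel refreshes at 60 Hz
print(lfc_repeat_count(20.0))  # 3 -> panel refreshes at 60 Hz
print(lfc_repeat_count(75.0))  # 1 -> no compensation
[/code]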
I have a tiny, dumb question... Do you think companies are going to fix their RTX 20#0 GPU issues in the future and slap a sticker somewhere on the box saying something shorter than "Hey, this is the fixed card! Look nowhere else!", like "V.2"?
ASUS X570 TUF GAMING PLUS, 32GB DDR4@2666, RYZEN 5800X3D (no OC), GIGABYTE RTX 4070 Super GAMING OC, Western Digital Blue 4TB 5400RPM + SAMSUNG 860 EVO 500GB + 1TB SSDs, OEM SATA DVD 22x, Noctua NH-D15 Chromax Black, BenQ XL2420T. Case: Be Quiet! DARK BASE PRO 901. PSU: CORSAIR RM1200 SHIFT
I doubt it; the promised "ultra easy fix for the 970 low bandwidth problem" never came either, and it was later shown to be a design fuckup that wasn't easily fixable. Expect the same with this generation, since it's Nvidia.
paxsali wrote:
Now, I don't know what hardware costs in Poland, I guess it's cheaper because everything is stolen from Germany and resold...
I have a tiny, dumb question... Do you think companies are going to fix their RTX 20#0 GPU issues in the future and slap a sticker somewhere on the box saying something shorter than "Hey, this is the fixed card! Look nowhere else!", like "V.2"?
I doubt they want to call attention to the problems, much like when the 1070 cards had their memory issues. In the end, they hope you won't notice the problems on your high-end video card so they can avoid replacing it for free. It's better for them not to draw more attention to the issues than needed, since not everyone researches these sorts of tech problems, so many won't even know about them.
AMD just announced the Radeon VII (as in Radeon Seven, the seven a nod to 7nm). The product is based on 60 compute units, which at 64 shaders per cluster works out to 3840 shader processors. That means this is a Vega 64 shrunk to 7nm with slightly fewer shaders active.
We have talked about Vega 20 many times; it is the 7nm iteration of the existing Vega GPU. The GPU is paired with 16GB of HBM2 memory and can boost into the 1800 MHz range. AMD gets 25% more performance out of the product at the same power as Vega 10. In some slides, the card is shown performing at GeForce RTX 2080 level.
AMD has not shared any further information on the architecture at this time. AMD will bundle The Division 2 with the card, as well as with select Ryzen 5 and 7 processors.
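As a quick sanity check on the numbers above (nothing here beyond the figures in the post, plus the publicly known Vega 64 shader count for comparison):
[code]
# Shader-count arithmetic for the announced Radeon VII vs. Vega 64.
radeon_vii_shaders = 60 * 64    # 60 CUs x 64 shaders per CU = 3840
vega_64_shaders    = 64 * 64    # 64 CUs x 64 shaders per CU = 4096

print(radeon_vii_shaders, vega_64_shaders)            # 3840 4096
print(f"{radeon_vii_shaders / vega_64_shaders:.0%}")  # ~94% of Vega 64's shader count
[/code]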
Pay attention to the moment when The Division 2 dev talks about how it "will support all the new technologies of..." and Lisa stops him! I think she was making sure he wouldn't spill the secret. I think AMD is preparing their own ray tracing!
..:: Life - A sexually transmitted disease which always ends in death. There is currently no known cure::..
Pay attention to the moment when The Division 2 dev talks about how it "will support all the new technologies of..." and Lisa stops him! I think she was making sure he wouldn't spill the secret. I think AMD is preparing their own ray tracing!
I'm doubtful about that.
They announced the GPU, so what's stopping them from advertising all the features?
What I don't get is why they are so focused on using HBM memory on their GPUs.
It will just make production harder for them, and the GPU will be scarcely available, like the Vega 56 and 64.
Besides, why the fuck do you put 16 GB of RAM on it anyhow?
That's the only reason the GPU is 700 dollars, not because they put in a "secret" technology they didn't announce yet.
16GB is very nice for workstation use (video editing at high resolutions, etc.); this card offers pretty amazing value at 699 dollars, like a gaming/workstation hybrid card. Still disappointed that they are no match for Nvidia in gaming performance, not that I even care about that anymore, since gaming is shit these days.
What I don't get is why they are so focused on using HBM memory on their GPUs.
It will just make production harder for them, and the GPU will be scarcely available, like the Vega 56 and 64.
Besides, why the fuck do you put 16 GB of RAM on it anyhow?
That's the only reason the GPU is 700 dollars, not because they put in a "secret" technology they didn't announce yet.
History is unfortunately repeating itself.
16 GB makes it very attractive for workstation and compute workloads that need the memory, and bandwidth and performance improve as a result.
Cost, though: HBM2 is still a fair bit more expensive than GDDR5 and even GDDR6, and from what I've read this configuration would cost around 320 US dollars, so almost half of the card's price comes from the VRAM alone.
Thus the higher price for the card itself, but that does make it hard to compete against the now similarly priced 2080, which NVIDIA could respond to by dropping the price of the 2060 or 2070 a bit, or even the 2080 itself down the line, if AMD is only barely going to match its performance level.
(Still, for a die shrink and slight overclock the reported performance gain is far better than I would expect, so actual non-promo benchmark figures will be interesting to see closer to launch in early February.)
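Taking the ~320 dollar HBM2 estimate above at face value (it's a rough figure from reading around, not a confirmed bill-of-materials number), the share of the 699 dollar MSRP works out like this:
[code]
# Rough arithmetic only; the $320 figure is the estimate quoted above, not a confirmed cost.
hbm2_cost_estimate = 320   # USD, estimated cost of the 16GB HBM2 configuration
card_msrp = 699            # USD, announced Radeon VII price

share = hbm2_cost_estimate / card_msrp
print(f"{share:.0%} of the MSRP")   # ~46%, i.e. "almost half the card's price"
[/code]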
It's well known that FreeSync implementations vary wildly. That's the reason some people defended the price of G-Sync: software glitches aside, there was a higher chance of it working right.
TWIN PEAKS is "something of a miracle."
"...like nothing else on television."
"a phenomenon."
"A tangled tale of sex, violence, power, junk food..."
"Like Nothing On Earth"
Looks like the green goblins will still hold the whip hand (as expected), not quite the fierce competitive environment that would be needed to improve the flabby and overpriced current situation. I'm very interested in the future Ryzen though, so there's that.
Will still wait before making a full upgrade though (assuming the PC doesn't commit seppuku *touches wood*); we're still in a transitional era from a gamer's perspective, with little incentive to push boundaries at the moment.
Reading r/amd and seeing people (I mean fanboys, sorry) justifying the same price as the 2080 with fewer features (no RT, no DLSS, etc.) just because... AMD. Makes one wonder.
Anyway
Main PC : I7 12700, MSI Ventus RTX 4090 24gb, Alienware AW3423DW QD-OLED
Laptop : I5 4200H @ 3400mhz boost, GTX 850m 2gb Vram DDR3, 4gb RAM DDR3
Derpsole : Playstation 5 disc edition, Ninty Switcherino
TV+audio: LG CX 65" / Sonos ARC + SL ones + Sonos sub 3
VR Headset: Meta quest 2 airlinked
Reading r/amd and seeing people (I mean fanboys, sorry) justifying the same price as the 2080 with fewer features (no RT, no DLSS, etc.) just because... AMD. Makes one wonder.
Anyway
I wonder how many titles will actually support one or both features by the time the next gen of GPUs comes out, and how many of those titles will actually be on people's gaming menu. Not that I dislike the features, but if there is no content using them, they will be useless.
And that Radeon VII card is indeed too expensive, and again aimed at both gaming and pro markets. High-end gaming is just not AMD's target atm.
"Enlightenment is man's emergence from his self-imposed nonage. Nonage is the inability to use one's own understanding without another's guidance. This nonage is self-imposed if its cause lies not in lack of understanding but in indecision and lack of courage to use one's own mind without another's guidance. Dare to know! (Sapere aude.) "Have the courage to use your own understanding," is therefore the motto of the enlightenment."
Gay Fantasy XYZ has DLSS and Battleturd V has RTX, amazing, need to buy an exploding 2080 Ti ASAP.
Even if the games aren't up your alley (not mine either, that much), the tech is there and it is impressive. Why bother with the product giving you the worse experience at the same price, even if you won't play ALL the games with it, just a few?
RTX will be in more upcoming games (Metro Exodus soon), some of them more suited to a slower pace where you can 'smell the roses'. So will DLSS (coming to BFV on the 15th of Jan, I think? So 4K60 with RTX on should be doable on a 2080).
I get that it's not really worth it for you since you already have a 1080 Ti, but what about others upgrading (like I did from a 1070) or having to choose between the R7 and the RTX 2080?
Reading r/amd and seeing people (I mean fanboys, sorry) justifying the same price as the 2080 with fewer features (no RT, no DLSS, etc.) just because... AMD. Makes one wonder.
Anyway
I wonder how many titles will actually support one or both features by the time the next gen of GPUs comes out, and how many of those titles will actually be on people's gaming menu. Not that I dislike the features, but if there is no content using them, they will be useless.
And that Radeon VII card is indeed too expensive, and again aimed at both gaming and pro markets. High-end gaming is just not AMD's target atm.
That describes just about every new tech innovation ever. Ray tracing, lmao; realistically we'll see the improvement after 3-4 more generations, and devs won't do shit until both vendors support it and support it well. But sure, $2400 on NVIDIA RTX well spent.
If it's within 10% of the 2080's performance and $100-150 cheaper, I'll probably get one.