The problem is that some games are so badly optimized that you need a beefy system just to run them decently, e.g. Starfield, The Last of Us Part 1 or Dragon's Dogma 2.
There's also Alan Wake 2, Hellblade 2 and Avatar: Frontiers of Pandora that do actually look good.
You kind of prove my point. DD2 ran like shit regardless of the GPU and looked like a PS3 game; why waste time on garbage like that?
AW2 did look great, and I did play half of it, but the story just didn't do it for me. Does it justify spending $3K? I don't think so. The other games are meh/skip. Honestly, what exciting upcoming game has you wanting a 5090, or even the 4090? What exciting release did I miss out on by not shelling out money to NVIDIA?
@AmpegV4 Well, to be honest, I've met a lot of people recently who do have a 4090. Why? They do computing, they do AI natively for different purposes, they do rendering, and heck, some even do 3D (real stereo 3D for DX12 games, custom 2D>3D conversions and mods) with, let me put it this way, a 4090 that still comes up short in most cases. A guy recently told me he's waiting for the 5090 to be able to surpass 30 fps.
For gaming? I'm with you, I can't even max out my 3080. I was even able to run everything with DLSS and FSR3 frame gen on my old 2060 Super!
AFAIK DD2 overburdens the CPU with NPC calculations.
A 7-year-old card with NO DRIVER SUPPORT, and it still beats team green's 1060 by a mile, while the 1060 still has driver support.
Fuck NVIDIA AND AMD.
Fucked on RAM by green, fucked on drivers by red.
I chose black: the hole where the sun don't shine, where they both belong.
Still running this on one of my PCs that I use for the not-so-demanding stuff, and honestly this card is a legend. It lasted me so many years and got me by just fine during the GPU shortage, the mining and scalper hordes, etc. It still runs things and functions great.
I will keep it until it ceases breathing, as it feels like a waste to replace it.
Meanwhile today you pay up the ass for Ngreedia to give you crumbs of VRAM so you can get gaped by them the next year when it's intentionally made obsolete. Such a waste of processing units.
I really want to upgrade but I'm having doubts about what I want. Money isn't the issue, but I don't like to spend $3K+ for a noisy heat oven next to my screen. Waiting for the 5xxx NVIDIA releases.
Since my 2070 Super died (heat issues/wear, having problems with it after 3 years of ownership), I bought a 4060. Crap card, but enough to run current-gen games with some tweaking at 1440p, and it's only for a year (should have done some more research and spent the extra buck for a 4060 Ti with 16 gigs though). But after using it for some time, it's such a relief having a low-power GPU.
The silence. No heat. No feeling like I'll have to cap my FPS or this shit will blow up over time (although that strategy didn't save my 2070 Super). My rig feels so much more stable and durable.
These times just suck for PC hardware.
- Very low increase in performance over the years
- Insane and ever-increasing prices
- High-end hardware needing insane power and cooling, with high wear
- No PC-specific games that make it even worthwhile to buy a high-end PC
Unless you really have to have the highest-end everything (including cooling), there is very little reason to buy hardware these days. Terrible prices, and old hardware is still holding up well.
When you go highest of the high end there will be quirks, heat being the biggest issue; hell, you even need active cooling for the latest NVMe SSDs or they will fry themselves.
Only thing bad about the 4060 is its price depending on where you are in the world.
The industry just wants you to buy a new rig every 3 years. It's designed to fail. They know that chip design isn't progressing as fast as in the past and there is little reason to upgrade otherwise.
In the past people overclocked their CPUs and GPUs because the industry was very conservative in that respect, since people needed to upgrade anyway. I've never had PC hardware fail after the warranty period back then. Now it's the other way around: everything is overclocked out of the box, hardware is pushed to its limits by design, and stability or wear is not a concern for them.
From my experiences and observations, that seems to affect high-end parts the most, as they generate a lot of heat and munch on too many watts to push the bar higher. Cooling is something essential to think about if you care at all about keeping a part for a long, long time and not replacing it within a year or two.
And honestly, if all you do is game, then it's not like the industry is pushing out bangers like it's 2004; it's just broken cancer out there, and most of the games I actually find any fun playing can run on a potato.
You should clean your case and components thoroughly every 1-2 years and change the cooling paste every 2 years, or you are gonna have issues with high-end hardware after some time. Most people don't do that. Neither do I; I only use a dust spray to clean the case a little when I'm having heat issues.
Changing cooling paste on a GPU is a shitty job. You can break the card because the paste can glue parts together, and you can easily break a cable. While it's essential maintenance, it's only for enthusiasts. As I said, the industry wants you to buy a new card. They don't design them for easy maintenance, and they don't advise you to do it; even more, removing the fans voids the warranty.
I'm 99% sure my 2070 has a cooling paste issue because it crashes while warming up and after that it's fine. Watched a couple of how-to videos on changing the paste and saw something break in almost half of them.
Since I'm lazy and don't want to end up frustrated with a broken GPU after 2 hours of tedious work, I just bought a new card.
For new AAA gaming/3D workstation builds it makes zero sense, but for babby's Minecraft/e-sports or daddy's CRPG/indie/retro/basic video editing use it is still more than capable.
In general, aside from ray tracing, 3D gaming performance has kinda plateaued, and for most people a $700-850 "mid range" new card is absolutely absurd.
I still have one on one of my machines and it still surprises me to this day how well it still manages to run things considering its age.
5070 will come with 12GB VRAM, 50% more than the 1070 8 years ago.
"Enlightenment is man's emergence from his self-imposed nonage. Nonage is the inability to use one's own understanding without another's guidance. This nonage is self-imposed if its cause lies not in lack of understanding but in indecision and lack of courage to use one's own mind without another's guidance. Dare to know! (Sapere aude.) "Have the courage to use your own understanding," is therefore the motto of the enlightenment."