Page 189 of 772
Posted: Thu, 19th Jun 2014 21:17 Post subject:
New chart or GTFO
ASUS X570 TUF GAMING PLUS, 32GB DDR4@2666, RYZEN 5800X3D (no OC), GIGABYTE RTX 4070 Super GAMING OC, Western Digital Blue 4TB 5400RPM + SAMSUNG 860 EVO 500GB+1TB SSDs, OEM SATA DVD 22x, Noctua NH-D15 Chromax Black, BenQ XL2420T. Case: Be Quiet! DARK BASE PRO 901. PSU: CORSAIR RM1200 SHIFT
Posted: Fri, 20th Jun 2014 10:49 Post subject:
Can a Sapphire Tri-X 290 be modded to the X version? (This version is already OCed and doesn't have that quiet/uber mode; it's pretty quiet as it is.) Is there any way to check that?
Frant
Frant
King's Bounty
Posts: 24645
Location: Your Mom
Posted: Fri, 20th Jun 2014 11:25 Post subject:
Extremely unlikely, since the specs are different, i.e. the silicon has been laser-cut to disable a "core" (256 stream processors and 16 texture units).
Long gone are the days when you could use the pencil trick (oh, lovely Athlon 64) or simply flash a different firmware to the GPU to get the disabled units enabled. In fact, I think that hasn't worked since the Radeon 9800 XT/XTX, where you could flash the XTX BIOS to the XT card and get the full XTX spec (if you weren't unlucky enough to get a GPU that actually had broken parts disabled in firmware, and end up with a crashing/artifacting card until you flashed it back).
The difference between the 290 and 290X is marginal though... Push it as far as safe overclocking allows and you'll be well into 290X territory.
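A back-of-the-envelope check of those figures, assuming the usual Hawaii unit counts (2816 SPs / 176 TMUs on the 290X vs 2560 SPs / 160 TMUs on the 290); this sketch is illustrative, not from the thread:

```python
# Rough check of the disabled-unit figures, assuming standard Hawaii counts:
# 2816 SPs / 176 TMUs on the 290X vs 2560 SPs / 160 TMUs on the 290.
sp_290x, sp_290 = 2816, 2560
tmu_290x, tmu_290 = 176, 160
print(sp_290x - sp_290, "stream processors disabled")   # 256, as stated above
print(tmu_290x - tmu_290, "texture units disabled")     # 16, as stated above
print(f"theoretical shader advantage: {sp_290x / sp_290 - 1:.1%}")  # 10.0%
```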
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!
"The sky was the color of a TV tuned to a dead station" - Neuromancer
Posted: Fri, 20th Jun 2014 11:37 Post subject:
Thanks, won't bother then.
Posted: Fri, 20th Jun 2014 12:00 Post subject:
I'm sure he already knows that...
Frant
King's Bounty
Posts: 24645
Location: Your Mom
Posted: Fri, 20th Jun 2014 13:37 Post subject:
Hmm, I thought it was only possible with the first rushed-out batch of reference cards that AMD shipped to avoid a soft launch.
I wouldn't risk a bricked card though; the difference between the 290 and 290X at the same clocks is only ~4%.
This thread is pretty good for checking the possibility (and the risks and performance difference): http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
I still think it's extremely unlikely that you'll find an unlockable card today.
And people using GPU-Z to "prove" an unlock are making a mistake. The driver gets its info from the flashed firmware, which now claims 2816 stream processors, but that doesn't mean the card is actually unlocked. Sometimes it will even change the reported name of the RAM on the card based on the firmware info (e.g. from Elpida to Hynix). Basically it shows the info from the flashed firmware and doesn't actually scan the card/GPU/RAM (nor can it, for that matter). The only way to know whether a card is truly unlocked is to run many repeated benchmarks, and forget 3DMark etc., since they all tend to deviate by ±5% from run to run depending on a ton of things, meaning you will have a hard time knowing. Also note that flashing the firmware means your card will use tighter RAM timings, and perhaps slightly different voltages...
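A minimal sketch of the repeated-benchmark comparison described above (the run scores are made-up placeholders; substitute your own logged runs):

```python
# Deciding whether an "unlocked" card is really faster, given the
# ~±5% run-to-run benchmark noise mentioned above.
from statistics import mean, stdev

before = [612, 598, 605, 621, 590, 608, 615, 601]  # scores before flashing (placeholders)
after  = [628, 611, 640, 619, 603, 634, 622, 630]  # scores after flashing (placeholders)

def summarize(runs):
    m, s = mean(runs), stdev(runs)
    return m, s, 100 * s / m  # mean, std dev, noise as % of mean

m1, s1, n1 = summarize(before)
m2, s2, n2 = summarize(after)
gain = 100 * (m2 - m1) / m1
print(f"before: {m1:.1f} ±{n1:.1f}%   after: {m2:.1f} ±{n2:.1f}%   gain: {gain:.1f}%")
# Rule of thumb: if the claimed gain (~4% for 290 -> 290X at equal clocks)
# is smaller than the run-to-run noise, you need many more runs before the
# averages mean anything.
```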
Not worth it, really. I did flash my XT to XTX, but it was more for the heck of it than anything else. I was lucky enough to get a stable flash, but the difference wasn't particularly noticeable.
(I've bricked other cards that I tried to flash upwards which actually had a broken cluster or something, and had to borrow a second card to reflash the original firmware.)
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!
"The sky was the color of a TV tuned to a dead station" - Neuromancer
Frant
Frant
King's Bounty
Posts: 24645
Location: Your Mom
Posted: Fri, 20th Jun 2014 14:36 Post subject:
Yes, I was so enthusiastic that I bought a "pen" of chemical silver formula (normally used to repair traces on circuit boards). I recently used it to repair a damaged Amiga 3000 motherboard (my own fault: I scratched a tiny spot with the soldering iron while attempting the FPU mod, overclocking it to twice the frequency). Suddenly all the Fast RAM in the machine was gone, and it took a long time to find the broken trace (there were ~40 twisting traces running from the RAM banks to the RAM controller chip). I masked the traces on either side (imagine masking a 0.5mm line), opened that old chemical silver "pen", dipped an electronics screwdriver into it and "painted" the trace, waited for it to set, booted up and saw it work again.
I also bought a 030 accelerator card for my A1200 (M-Tec 030) which had a non-functional FPU socket as well as a dead, leaking battery. I fixed that, found the fault (a dead capacitor between the CPU and the FPU socket), took a capacitor with the same farad value and tolerance from an old Epox P2 motherboard, and sold the card for more than I had paid. Then I bought the best of them all, the Phase 5 Blizzard MK IV 68030/MMU with 68882 FPU @ 50MHz. I changed the 32MB SIMM to a 64MB SIMM and swapped the 50MHz DIL crystal for a 55.5MHz one, getting 60MHz (but it needs better cooling, the 030 is getting really hot as it is).
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!
"The sky was the color of a TV tuned to a dead station" - Neuromancer
JBeckman
JBeckman
VIP Member
Posts: 34995
Location: Sweden
Posted: Tue, 24th Jun 2014 08:14 Post subject:
http://www.guru3d.com/news_story/thines_v_by_one_hs_brings_nvidia_g_sync_technology_to_4k_displays.html
Quote:
THine Electronics, Inc., the global leader in high-speed serial interface technology and provider of mixed-signal large-scale integration semiconductors, today announced that its high-speed interface technology, V-by-One HS, will support NVIDIA G-SYNC Technology which synchronizes display refresh rates to the GPU, eliminating screen tearing and minimizing display stutter and input lag.
V-by-One HS Technology supports higher data transmission rates inside digital equipment like monitors while reducing the number of cables and connectors. The enhanced performance and lower costs of V-by-One HS has made it the internal interconnect choice of major 4Kx2K televisions and PC monitors. In May 2014, Acer announced the world's first 4Kx2K gaming monitor, the XB280HK, which integrates NVIDIA G-SYNC technology for a vivid and responsive gaming experience.
"THine's V-by-One HS technology has penetrated global display markets such as 4Kx2K televisions," said Mr. Kazutaka Nogami, President and CEO of THine Electronics. "We look forward to collaborating with NVIDIA and display OEMs to bring to market new high-resolution displays featuring THine's V-by-One HS and NVIDIA G-SYNC technology."
V-by-One HS technology will be available widely in various markets such as automotive, surveillance and security, multi-functional printers, displays, robotics and amusement markets.
Key benefits of V-by-One HS
High transmission quality with high performance equalizer for noisy conditions
High data transmission quality solving cable skew problems with high speed Serializer / Deserializer using clock data recovery (CDR) technology
Lower electro-magnetic interference (EMI) with clock embedded transmission, no reference clock at receiver
Reduction of total cost and board space by optimizing cables and connectors
Seamless transition to V-by-One HS minimizing change of device input/output and peripheral design
Lower energy consumption with variable transmission speed: 600 Mbps to 4.0 Gbps
So with this I guess G-Sync can now work at resolutions up to 4K (3840x2160 or 3840x2400)? (Well, ASUS is already working on one.) Of course it will still need monitors with this technology, so it's up to the various partners how they want to proceed, I suppose.
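As a rough sanity check of why the 4.0 Gbps per-lane figure matters for 4K (my own arithmetic, not from the article; assumes uncompressed 8-bit RGB at 60 Hz and ignores blanking overhead):

```python
import math

# Uncompressed 4K@60 pixel payload vs. the 4.0 Gbps per-lane maximum
# quoted in the press release (blanking/encoding overhead ignored).
width, height, hz, bpp = 3840, 2160, 60, 24
payload_gbps = width * height * hz * bpp / 1e9   # ~11.9 Gbps of pixel data
lane_gbps = 4.0
lanes = math.ceil(payload_gbps / lane_gbps)      # -> at least 3 lanes
print(f"{payload_gbps:.1f} Gbps payload -> at least {lanes} V-by-One HS lanes")
```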
JBeckman
JBeckman
VIP Member
Posts: 34995
Location: Sweden
Posted: Tue, 1st Jul 2014 14:35 Post subject:
http://www.guru3d.com/news_story/amd_tonga_gpu_to_replace_radeon_r9_280.html
Quote:
AMD Tonga GPU to replace Radeon R9 280
And apparently it is happening in August already. Chinese VR-Zone posted that AMD's upcoming Tonga GPU will arrive sometime this August in order to replace the GPU you all know as Tahiti PRO, the R9 280, located in the mid-range. Little is officially known about the long-rumored Tonga, but the GPU should be locked and loaded with a 256-bit memory interface, 32 CUs and 2048 shader processors. Next to that it should get 32 ROPs and 128 texture mapping units.
Tonga might be based on GCN 2.0. The card would become available with a memory capacity of 2 GB, and perhaps we'll see some 4 GB variants. The Tonga graphics architecture will be built on the 28nm node from GlobalFoundries and would feature a new design introducing the latest architectural improvements, such as new ACE (Asynchronous Compute Engine) units and more focus on compute shaders. The main focus of the Tonga GPU is said to be power efficiency. Tonga will keep the basic technologies of the Radeon lineup, such as Mantle, TrueAudio and XDMA CrossFire support.
Have to look up what Tonga is now so I stop associating it with "thong", which is hopefully something entirely different from what Tonga actually is. Maybe it's an island again.
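The quoted shader numbers are internally consistent if you assume the standard GCN ratios of 64 stream processors and 4 texture units per CU (my assumption; the article doesn't spell this out):

```python
# Sanity check of the quoted Tonga unit counts, assuming standard GCN
# ratios: 64 shader processors and 4 texture units per CU.
cus = 32
shader_processors = cus * 64   # -> 2048, matches the quote
texture_units = cus * 4        # -> 128, matches the quote
print(shader_processors, texture_units)
```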
Werelds
Special Little Man
Posts: 15098
Location: 0100111001001100
Posted: Tue, 1st Jul 2014 14:41 Post subject:
Tonga is a kingdom somewhere east of Australia.
So yes, Southern Islands.
Posted: Thu, 3rd Jul 2014 03:58 Post subject: I have left.
Posted: Thu, 3rd Jul 2014 04:06 Post subject:
Last edited by Interinactive on Tue, 5th Oct 2021 01:59; edited 1 time in total
JBeckman
JBeckman
VIP Member
Posts: 34995
Location: Sweden
Posted: Mon, 7th Jul 2014 17:09 Post subject:
http://forums.guru3d.com/showthread.php?t=390877
Quote:
EDITORIAL — In case you didn't know that already, NVIDIA will soon launch a new Maxwell graphics processing unit, the GM204. Today we bring you the latest information about this particular processor, which will power the upcoming GeForce 800 series graphics cards.
The GK104 to retire soon
The GM204 is expected to replace the whole GK104 stack. It is the first silicon based on the second-generation Maxwell architecture. This generation will later be updated with the GK110 successor, the GM200.
According to SemiAccurate, NVIDIA is preparing at least four GeForce SKUs to be released before jumping to a smaller fabrication process. The list includes:
GeForce GTX 880 Ti
GeForce GTX 880
GeForce GTX 870
GeForce GTX 860
All these cards will differ in Streaming Multiprocessor count. Various sources claim that the full GM204 chip would feature between 15 and 20 SMs (1920-2560 CUDA cores). The Maxwell core is obviously much more power efficient than Kepler, thus the next flagship will most likely have fewer cores than the full GK110 processor. This has not happened before.
NVIDIA to skip 20nm fabrication process, third-generation Maxwell to use 16nm?
Now, here's the biggest shocker coming from the SemiAccurate article. According to their sources, NVIDIA will skip the 20nm node and move straight to 16nm. Not only that, the GM204 will be the first GPU remanufactured and relaunched using this process.
NVIDIA's GeForce GTX 880 launch is set for Q3. If everything goes according to plan, the new flagship should appear in October.
The second wave would launch somewhere in mid-Q1 2015, which gives us 4 to 6 months between the second and third generation of Maxwell. Long story short, the GM204 at A stepping is expected to be the last 28nm GPU NVIDIA is going to make. The GM204 B and all future GPUs will use the 16nm node.
There are a few possibilities here, and I think they are all worth mentioning.
Scenario #1: GM204B cards are called the same as GM204A models
According to SA, those new graphics cards based on the B revision would still carry the GTX 8xx naming scheme. In fact, all these cards would have the same codename as their 28nm revisions.
Of course it all depends on TSMC. The volume and launch date may be affected by difficulties met by the foundry.
In case you didn't know, the GK110 processor was also released in a B revision. It was first used by the GeForce GTX 780 GHz Edition. The transition to the B stepping was rather quiet, and only a very comprehensive analysis would reveal the advantage of GK110B. However, GK110 was 28nm the whole time.
Scenario #2: GM204B cards renamed
The GM204 is going to be much different. The B revision will carry much more than just a name. The smaller node will decrease power consumption significantly. Launching a new SKU with the same codename as its 28nm variant would cause absolute chaos on the market. This is where the other theory comes in. NVIDIA may add a special postfix to its naming scheme to indicate which parts are using the smaller node, for instance GTX+ (like they did with the 9800 GTX+), or call it a Green, GHz or Whatever Edition.
Scenario #3: GM204B cards to wait for the GeForce 900 series
If TSMC's 16nm yields are not as high as expected, there is a chance that the new parts would wait for the GeForce 900 series. In fact, I think this is the most reasonable scenario (assuming the whole 16nm thing is true), because nobody wants to buy a new flagship only to see the same card with much lower power consumption and higher overclockability a few months later. According to this scenario, the GTX 880 would be relaunched as the GTX 970, but of course using the GM204B GPU. This theory would also leave the GeForce 1000 series for Pascal (has anyone figured out how they are going to name these cards?).
_________________________________________________
Bear in mind that Charlie's website is called SemiAccurate for a reason. It could all be wrong from the beginning. However, SA is exceptionally accurate when it comes to NVIDIA's roadmaps.
Source: SemiAccurate
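The quoted CUDA-core range follows directly if you assume second-generation Maxwell keeps GM107's 128 CUDA cores per SMM (an assumption on my part; the article doesn't state it):

```python
# The quoted 1920-2560 core range, derived from 128 CUDA cores per SMM
# (GM107's ratio; assumed here to carry over to GM204).
cores_per_smm = 128
for smm in (15, 20):
    print(smm, "SMMs ->", smm * cores_per_smm, "CUDA cores")  # 1920, 2560
```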
Posted: Mon, 7th Jul 2014 18:23 Post subject:
Meh, I refuse to upgrade until there is a 16GB version, so I can get rid of that slow-ass DDR3 RAM altogether (!)