Author Topic: "Powergate" - AMD RX 480 reference cards drawing too much power from PCIE slot  (Read 2230 times)

https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/
AMD's new RX 480 graphics card fails the PCIe spec because it draws too much power from the PCIe slot on the motherboard. There are reports of people's PCIe slots and onboard audio dying. RIP.
/blogland
« Last Edit: July 01, 2016, 10:26:21 AM by Mr Queeba »


Time to invest in an Nvidia graphics card.

a pcie slot can only supply up to 75w.
if a driver update can't solve it, then it's quite a serious issue lol.

though high-end motherboards and people with good power supplies wouldn't be damaged by it.
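For context on where that 75W figure comes from: it's the sum of the per-rail current limits the PCIe Card Electromechanical (CEM) spec allows through an x16 slot. A quick back-of-envelope sketch, using the commonly cited CEM limits (these values are an assumption from memory, not measured data):

```python
# Per-rail slot limits commonly cited from the PCIe CEM spec for an x16 slot.
RAILS = {
    "12V":  (12.0, 5.5),  # volts, max amps through the slot's 12 V pins
    "3.3V": (3.3, 3.0),   # volts, max amps through the slot's 3.3 V pins
}

def slot_power_budget(rails):
    """Sum V * I over each rail to get the slot's total power budget."""
    return sum(volts * amps for volts, amps in rails.values())

total = slot_power_budget(RAILS)
print(f"x16 slot budget: {total:.1f} W")  # ~75.9 W, rounded down to "75 W"
```

The RX 480 issue was specifically overdrawing the 12V pins, which is why the total card power mattered less than how it was split between the slot and the 6-pin connector.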

shoot if i hadn't loaned my parents $500 last tuesday i would've bought this card lol

i mean i've got an overkill psu and a great mobo, but regardless i'm now leaning more towards a 1070

a pcie slot can only supply up to 75w.
if a driver update can't solve it, then it's quite a serious issue lol.

though high-end motherboards and people with good power supplies wouldn't be damaged by it.

Supposedly there's a lot of confusion about that statement.
Quote from: Reddit
Maybe I can help you out a bit, Raja. I have just read the PCIe 3 specification and it tells me something different. In my understanding the 75 watt isn't a maximum limit, it's just the default value at startup of the motherboard. The motherboard itself sets the maximum allowed wattage per slot in the "Slot Capabilities Register", which you can configure up to over 300 watts per slot. In bits 7 to 14, "Slot Power Limit Value", you can set 250, 275, 300 and above 300 watts.

https://www.reddit.com/r/Amd/comments/4qmlep/rx_480_powergate_problem_has_a_solution/
(Second paragraph)
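The fields described in the quote above can be decoded mechanically. A minimal sketch, assuming the standard PCIe layout (Slot Power Limit Value in bits 14:7, Slot Power Limit Scale in bits 16:15); treat this as illustrative, not authoritative:

```python
# Decode the Slot Power Limit fields from a PCIe Slot Capabilities dword.
# Bit positions and encodings are as I understand the PCIe spec:
#   bits 14:7  = Slot Power Limit Value (8 bits)
#   bits 16:15 = Slot Power Limit Scale (1.0x / 0.1x / 0.01x / 0.001x)
SCALE = {0b00: 1.0, 0b01: 0.1, 0b10: 0.01, 0b11: 0.001}

def slot_power_limit_watts(slot_caps: int) -> float:
    """Return the slot's advertised power limit in watts."""
    value = (slot_caps >> 7) & 0xFF   # bits 14:7
    scale = (slot_caps >> 15) & 0x3   # bits 16:15
    if scale == 0b00 and value >= 0xF0:
        # With a 1.0x scale, F0h-F2h are special encodings for 250/275/300 W;
        # higher values mean "greater than 300 W".
        return {0xF0: 250.0, 0xF1: 275.0, 0xF2: 300.0}.get(value, 300.0)
    return value * SCALE[scale]

# A register advertising the default 75 W: value = 75, scale = 1.0x
print(slot_power_limit_watts(75 << 7))    # 75.0
print(slot_power_limit_watts(0xF2 << 7))  # 300.0
```

So the register tells the card how much it *may* draw from the slot; it doesn't change what the slot's physical pins can safely carry, which is the part the Reddit thread disagreed about.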

why do we have to name everything "[something]gate"

why do we have to name everything "[something]gate"
Blame Richard Nixon.

You do understand that AMD is for the budget gamer, and that not everything will be perfect when first released. That's why you would typically wait a few months for all the kinks to get worked out before you invest in a brand-new GPU like that. Think about it: AMD has to invest in both CPUs and GPUs, so not everything they make is going to be picture perfect. AMD has done me well for the 8 years I've used them, and I wouldn't say that one GPU having this problem is really that big of a deal. Let them fix the issue; even Nvidia has had their issues.

this is why i'll wait one or more months before buying it instead of buying it the week it gets released

You do understand that AMD is for the budget gamer, and that not everything will be perfect when first released
yeah, I mean, who would ever buy a graphics card without expecting it to break your computer, right?

yeah, I mean, who would ever buy a graphics card without expecting it to break your computer, right?

>buying a reference card

You do understand that AMD is for the budget gamer, and that not everything will be perfect when first released. That's why you would typically wait a few months for all the kinks to get worked out before you invest in a brand-new GPU like that. Think about it: AMD has to invest in both CPUs and GPUs, so not everything they make is going to be picture perfect. AMD has done me well for the 8 years I've used them, and I wouldn't say that one GPU having this problem is really that big of a deal. Let them fix the issue; even Nvidia has had their issues.

There is a HUGE difference between a driver issue and actual physical damage.

AMD is not a small company; they aren't totally garbage or useless. AMD has the funds and time to do quality testing to make sure that everything is fine. I can understand if the GPU doesn't take off the first or second time around because the drivers aren't compatible with Windows or something of the sort, but a GPU being released that has the potential to damage a mobo that can easily be worth between $100-300? That right there is a lack of quality control.

As far as I know, none of Nvidia's GPUs have damaged other devices in the computer. Sure, they have a stuffty marketing team, and sure, when 3rd-party partners create garbage versions of a GPU (I'm looking at you, EVGA 960) it can damage itself, but nothing that catastrophic.

a pcie slot can only supply up to 75w.
if a driver update can't solve it, then it's quite a serious issue lol.

though high-end motherboards and people with good power supplies wouldn't be damaged by it.

No, you can pump way more wattage through a PCIe slot. Way more. 75W was a safeguard back in the day, when garbage motherboards were everywhere and couldn't protect themselves from that much power around the PCIe slot; that's not the case anymore. There are plenty of old GPUs that pulled a lot more than 75W through the PCIe slot.

But what AMD should've done is properly give the GPU two 6-pin connectors, or a 6-pin and an 8-pin connector, to let the motherboard breathe. It's a huge screw-up from someone at AMD trying to look as cool as Nvidia with their "efficiency", not realizing that the card itself is already efficient enough, even with an extra 6- or 8-pin connector.

Why you shouldn't be worrying about the "extensive power draw" and how the media overblew this situation

TL;DW: 8 GB of VRAM being consumed on a card not powerful enough to utilize it all properly, in a benchmark game that consumes the most VRAM possible, running at 4K. Of course it's gonna spike, but it's far less than people thought.

Afterthought: I've heard multiple people say to just get the 4GB version. This kind of proves it's a better idea.
« Last Edit: July 01, 2016, 05:31:31 PM by ShadowsfeaR »