Despite being downvoted, u/splepage is absolutely correct. Every watt of power consumed by an electrical device is eventually output as heat. A 500W TDP graphics card using all 500W of its TDP would output exactly as much heat as a 500W space heater.

This is a big part of the reason why we don't want GPUs from each successive generation to continuously raise their TDPs, which unfortunately is exactly what it looks like is going to happen (again) with the upcoming generation. It's bad for the environment and often turns our gaming spaces into uncomfortable hotboxes. What we should want is for Nvidia and AMD to continuously improve on both performance AND power efficiency.

Card temperature readouts have nothing to do with how much wattage of heat the card is outputting into the room. End of rant.

Edit: I want to address an argument that happened below. Let's say you have a theoretical GPU with a rated TDP (thermal design power) of 100W:

If you set the power limit to 100%, the GPU will use all 100W to get as much performance as it can.

If you set the power limit to 90%, it will only use up to 90W to get as much performance as it can, generally leading to slightly less performance but with lower power consumption and thus heat.

If you set the power limit to 110%, it will use up to 110W to get as much performance as it can, generally leading to slightly more performance but with higher power consumption and thus heat.

Generally, the more power budget a GPU has, the more performance it can achieve, as long as it has sufficient cooling. Higher power budgets also mean more heat, so there's a balance there.
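The arithmetic behind the power-limit example is simple enough to sketch in a few lines. This is purely illustrative (the 100W TDP figure is the post's hypothetical, and the function names here are made up, not any real driver API); the key point is that the heat dumped into the room equals the power actually drawn:

```python
TDP_WATTS = 100  # rated TDP of the hypothetical GPU in the example above

def power_budget(limit_percent: float) -> float:
    """Maximum board power (watts) the GPU may draw at a given power-limit setting."""
    return TDP_WATTS * limit_percent / 100

def heat_output(watts_drawn: float) -> float:
    """Essentially all electrical power drawn ends up as heat in the room."""
    return watts_drawn

print(power_budget(90))    # 90% limit -> 90.0 W cap, slightly less performance and heat
print(power_budget(110))   # 110% limit -> 110.0 W cap, slightly more performance and heat
print(heat_output(500))    # a card drawing 500 W heats the room like a 500 W space heater
```

On real hardware the equivalent knob is the driver's power limit, e.g. `nvidia-smi -pl <watts>` on Nvidia cards, which caps board power in watts rather than as a percentage.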