Normally I'd be ripping apart Snaked, but he's not entirely wrong. Not on the first count, at least.
NVIDIA has historically targeted the high-performance crowd. Their GeForce 3 (IIRC) all but wiped out the competition for about 3-4 years in the late 90s/early 00s, and it's that reputation which has led to market domination and thus more resources to sink into producing better tech; their cards generally come out on top because they can afford better R&D.

AMD's acquisition of ATi means they found themselves eating up a lot of the mid- and low-end consumer market (while also having to compete with Intel). They have less R&D cash to play with, so they have to be smarter: you'll notice AMD trying clever, quirky things (Mantle, for instance), and then NVIDIA coming out with a much better implementation of the same idea if they think there's value in it.
While the data available usually lags a couple of generations behind, you'll find that NVIDIA's cards have a (slightly) lower failure rate and a longer life expectancy than their comparable AMD equivalents, which makes sense given the above.
As for the idea of there being "better" CPU+GPU combinations... eh? Sort of, not really. Most CPU-GPU interaction is handled by the driver stack, which is all software, so you'd be better off complaining about Microsoft/Apple/the Free Software Foundation (and the various vendors for each distribution).

The one hardware piece you might be thinking of is that in newer architectures, the old Northbridge/Southbridge combo (which handled communication between the CPU and the rest of the PC) is now integrated directly into the CPU (Intel calls it the System Agent; I'm not sure of the AMD equivalent). Honestly though, if that were the case, your RAM, HDD and USB devices would also be suffering, since they're not NVIDIA/AMD branded either.
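If you want to convince yourself that the CPU/GPU pairing lives in the driver layer, here's a minimal sketch of my own (assuming a C compiler and the Vulkan headers/loader installed, nothing from the post above) that enumerates GPUs through the vendor-neutral API. The application only ever talks to the driver; which CPU vendor sits underneath never enters into it:

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    /* Minimal instance: the Vulkan loader hands us whatever driver(s)
       the OS has installed, regardless of CPU vendor. */
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo ci = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "no Vulkan driver available\n");
        return 1;
    }

    /* Ask the driver which physical GPUs exist. */
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice devs[8];
    if (count > 8) count = 8;
    vkEnumeratePhysicalDevices(instance, &count, devs);

    for (uint32_t i = 0; i < count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devs[i], &props);
        /* vendorID is a PCI vendor code: 0x10DE = NVIDIA,
           0x1002 = AMD, 0x8086 = Intel. */
        printf("GPU %u: %s (vendor 0x%04X)\n",
               i, props.deviceName, props.vendorID);
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Run it on an Intel+NVIDIA box and an AMD+NVIDIA box and the output is the same, because the API contract is the same; any behavioural difference between setups comes from the driver implementation, which is exactly the software layer I'm pointing the finger at above.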