Intel or AMD, Nvidia or AMD: Where does your allegiance lie, and why?
-
The original question's pretty broad, so I'll break it into home and work.
For work, I go with AMD CPUs. They have good bang for the buck, and as a virtualization engineer, the workloads I deal with are highly multithreaded, so the more cores the merrier. At present, the Opterons have up to 16 cores to work with. For graphics, I go Nvidia, as several Quadro models are supported for use with VMware Horizon View in order to bring hardware-accelerated graphics to VDI. I don't deal with end-user hardware at all.
For home, I go with AMD CPUs. Partly it's bang for the buck, but partly it's due to history and a personal tendency towards vendor loyalty. After my 486, I moved up to a K6, and have been AMD ever since. Graphics-wise, I go with Nvidia. For some reason, I've had zero luck with ATI/AMD discrete graphics cards any time I feel adventurous and decide to give them a try. Beyond that, I was a 3dfx customer, and have kept loyalty through acquisition into Nvidia.
-
@alexntg Very much brand loyal as well. As far as AMD CPUs go, you do get a lot more bang for your buck, and their GPUs as of late have been quite impressive, especially the Radeon HD 7990 and the R9 290X. Both cards are extremely impressive, and until the R9 295X, the 7990 was the fastest single card for gaming in the world. But the drivers offer poor optimization and sad overclocking features, and every good AMD card I've seen since the 7970 wants to run at 90 degrees Celsius, and will even shut the fans off to get itself to that temperature. I would absolutely pick up an R9 290X, if it weren't for the fact that it is just miserable at supporting DirectX 11.2 tessellation, and that it doesn't have access to any market-changing features like Nvidia does (except AMD App Acceleration, a mind-blowing feature when you see it in use, which is especially useful as a budget feature: you can buy a horrid CPU and a good GPU, and in normal tasks the GPU will work as an extra pre-processor for the CPU and boost performance quite a lot). I was really happy to see Mantle coming, as it was supposed to turn the tables, but it just didn't deliver. It raised performance by dropping graphical fidelity - something a user can do just by disabling V-Sync.
-
@alexntg Old 3Dfx customer here too. I still have several VooDoo cards kicking around I think.
-
@scottalanmiller Wasn't VooDoo the first to implement multi-card configurations?
-
@Mike-Ralston said:
@scottalanmiller Wasn't VooDoo the first to implement multi-card configurations?
Probably. I can't think of anyone who beat them to that.
-
It seems Nvidia absorbed them and their technology. I wish Asus would buy Nvidia. Not that it would ever happen, but I wish.
-
@Mike-Ralston said:
It seems Nvidia absorbed them and their technology. I wish Asus would buy Nvidia. Not that it would ever happen, but I wish.
NVidia bought 3dfx.
I doubt Asus is big enough to buy nVidia. NVidia is pretty big. They are a chip maker!
-
Never mind. Asus has a market cap of a quarter trillion dollars. Lol.
-
If Asus bought nVidia it would cause card licensing issues. Probably not worth it to them.
-
Asus does the actual manufacturing for a lot of HP's gear.
-
@scottalanmiller And they're quite notable as some of the best aftermarket cooler producers for AMD and Nvidia cards, although EVGA usually makes the best Nvidia cards (you pay a bit more, but you're getting the best). Asus could easily buy Nvidia; Asus is HUGE. They are pretty much the top name for anyone looking to spend lots of money building a PC. If only they made RAM and PSUs... You could have an Asus/AMD color scheme all in that slick black and red, without having to worry about optimization or anything.
-
I had an old Matrox G450 32MB video card, a dual-head VGA model with a converter cable with S-Video and Composite video jacks. I used to download movies via Napster and connect the PC to the TV and my audio card to the home stereo unit!
@Mike-Davis I like EVGA cards for my clients; they always seem to work with no futzing.
-
This might be true for Intel as well, but I believe the issues we had with K6 and K7 processors were probably due to inferior motherboards, as I was always looking for inexpensive MSI and ECS products.
-
@technobabble said:
This might be true for Intel as well, but I believe the issues we had with K6 and K7 processors were probably due to inferior motherboards, as I was always looking for inexpensive MSI and ECS products.
Definitely in the K6 era, people were routinely pairing cheap parts with the processors. Hard to tell which was causing the issues.
-
@technobabble EVGA is straightforward. Plug it in, turn it on, and get the drivers. The Factory Superclocked ones are extremely stable and manage to get insane clock speeds. I didn't believe it when I first saw it, and I still have trouble believing it, but my EVGA GTX 660 SC Signature model will push itself from the standard clock of 980 MHz all the way up to (if you know a ton about GPUs, this number will rock your socks) 1387 MHz. 1387 MHz! And it's air-cooled and has never gone above 70 Celsius. I'm very impressed with EVGA; I'll probably never buy a GPU from a different manufacturer, unless it has something really special they don't. NOTE: This pushes the GPU to be faster in ALL ways than a GTX 760. The memory will OC itself from the standard 1502 MHz to 2100 MHz.
-
@Mike-Ralston this was supposed to be your name here: @Mike-Ralston (not @Mike-Davis). I like EVGA cards for my clients; they always seem to work with no futzing.
-
@technobabble Haha, I figured. You know you can edit posts, right?
-
@Mike-Ralston Yep..but not that one...the "missing edit button" strikes again!
-
@technobabble Missing edit button! Aha, that'll do it.
-
I still haven't seen that happen.