Get Diablo IV with select GeForce RTX 40 Series graphics cards (limited offer). Value: $69.99.
| Brand | GIGABYTE |
|---|---|
| Series | Eagle |
| Model | GV-N4080EAGLE-16GD |
| Interface | PCI Express 4.0 x16 |
| Chipset Manufacturer | NVIDIA |
| GPU Series | NVIDIA GeForce RTX 40 Series |
| GPU | GeForce RTX 4080 |
| CUDA Cores | 9728 |
| Memory Size | 16GB |
| Memory Interface | 256-Bit |
| Memory Type | GDDR6X |
| DirectX | DirectX 12 |
| OpenGL | OpenGL 4.6 |
| Multi-Monitor Support | 4 |
| HDMI | 1 x HDMI 2.1 |
| DisplayPort | 3 x DisplayPort 1.4a |
| Max Resolution | 7680 x 4320 |
| Cooler | Triple Fans |
| Thermal Design Power | 320W |
| Recommended PSU Wattage | 850W |
| Power Connector | 1 x 16-Pin |
| Form Factor | ATX |
| Max GPU Length | 342 mm |
| Slot Width | 3.5 Slots |
| Date First Available | November 01, 2022 |
Pros:
- Faster than a 3090/Ti
- Great for live streams
- 16GB VRAM (which AMD offers as standard)
- Roughly 3080-level power draw with more performance on tap (better efficiency)

Anyway, coming from a 3080, I was anxious to see any improvements in games, rendering and general computing. I had the MSI in mind, but that one lacked lights, and I am now so used to lights that I need SOMETHING. After playing with the Gigabyte lighting, it is enough for me, and the fan lights actually bounce off of the back-plate to brighten the case up further. Plus, the app allows GPU and motherboard light syncing, or adjusting each area separately with different colors and effects. I used to run NVIDIA green; now I have settled on white or near-white fans, with the Gigabyte logo on the GPU set to temperature mode, which is NVIDIA green by default. It's a nice color combo and shows my system as being all Gigabyte.

I will say that, unlike past GPUs, this one shows the BIOS screen over HDMI 2.1; usually you can only use DisplayPort for that. I already had the driver installed, but it seemed to switch over for this card. The image looked gloomier at first, but after a few restarts it ended up looking a little better than the 3080. Both cards seem to dull the colors on HDMI 2.1 at 240Hz, for some reason. Colors remain vivid on DP, but I don't use that because of the compression; not that there is anything visually wrong with it, but I prefer an uncompressed signal. After a while the HDMI colors looked good. Maybe my eyes got used to it.

In general PC use, the extra VRAM started showing itself: textures, speed and pictures/video looked noticeably better, but not WORLDS better. Videos are more vivid and detailed, but I don't know if that is because the GPU is faster, because the VRAM is larger and faster, or because the extra speed in general helps with images. In Photoshop I noticed no improvement over the 3080. In video rendering, I have not noticed much of a speed improvement in Vegas Pro 19 and 20, or PowerDirector 21; however, I have not experimented much with those on the 4080 yet. If it means anything, the 3080 used to render at near 100% GPU usage, but the 4080 only went as high as 30%. Maybe that is an example of its speed or efficiency, or maybe the material was not taxing enough to max it out; a rough way to check is sketched below.
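If you want to see whether a render job is really GPU-bound, one rough approach is simply to poll the card while it runs. The minimal sketch below shells out to NVIDIA's nvidia-smi tool (it ships with the driver); the one-second interval and the particular fields queried are just my illustrative choices, not anything the renderer requires.

```python
# Rough GPU monitor: prints utilization, VRAM and power draw once a second
# while a render runs. Assumes the NVIDIA driver (and therefore nvidia-smi)
# is installed and on PATH.
import subprocess
import time

FIELDS = "utilization.gpu,memory.used,memory.total,power.draw"

def sample() -> str:
    """Return one CSV line of GPU stats from nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    print(FIELDS)
    try:
        while True:             # Ctrl+C to stop
            print(sample())     # e.g. "30 %, 4100 MiB, 16376 MiB, 120.50 W"
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```

If the card sits around 30% for the whole render, the bottleneck is more likely the CPU or the export settings than the GPU itself.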
Cons:
- PRICE and SIZE! Value is poor.
- They are back to giving away free games, but not of our choosing. At these prices they should have given away five games, and we have to install GeForce Experience to get the game!
- Unnecessary size.
- Ray tracing is STILL nothing special.
- No games require 16GB VRAM yet.

Let me start by writing that I have been building computers since the 90s, and this is the roughest and most time-consuming add-on that I have EVER installed in a PC. The main problem is that this thing is way too big, for no real reason other than that they did not want to build smaller coolers; plus, the larger the card, the greater the perception of cost and value. I have a full tower with a ten-bay hard drive cage on the side. If the drive bays had not been there, this install would not have been much of a problem, but I still had to remove a fan and unplug many cables to get this in, and I JUST got it in there. I have a Gigabyte X299X motherboard, and the bracket that is supposed to support the card SHOULD have worked, but the part that attaches to the GPU was out of place and there was not enough room to slip it under anyway. You would have thought it would line up better with their own motherboards. It was so tough that it was almost a two-man job. After resting a few times and FINALLY getting it to snap into place, everything JUST fit: it JUST fit underneath the CPU fan and RAM, and the edge of the GPU cooler JUST cleared the side panel, in a FULL-SIZED TOWER. The only good part is that my other two occupied PCIe slots were still usable.

There has been much said about the power plugs, but if anyone can afford these overpriced GPUs, you would think they would have enough common sense to plug something in correctly. Just to be safe, I plugged the 16-pin connector into the GPU BEFORE I tried to seat the card and made sure it snapped in and could not move. Once seated, I bent the cable very slightly about two inches from the connector and plugged ALL THREE GPU power leads in until they locked. The only plug issues I had were that I forgot to reconnect some motherboard cables I had removed in order to get the GPU in, mainly the primary power cable; I also forgot to plug the LED, fan, audio and a SATA cable back in. I have never had to unplug cables like that for any other part.
Overall Review: For gaming, I used the games I had been playing lately: GTA V Online, Youngblood, BioShock 2 and Control. Obviously, I used two of those games for ray tracing and its speed improvements over the 3080, and RTX speed seemed just fine to me. I will start by saying that the 4080's RTX reflections seem more diffused than the 3080's: the 3080's looked mirror-like, while the 4080's have noise. I hope that is not why it's faster at RTX. Control looked cleaner on the 4080 with everything maxed out at 1440p, 240Hz, and no issues whatsoever; on the 3080 I had to dial back a feature or two, but that was it. I still don't see the big deal about ray tracing and I really could not care whether it was on or off. Youngblood I use as an RTX demo because it is newer and more detailed; the textures are cleaner and seem more maxed out, but on the 3080 every setting was maxed out as well, with no issues. BioShock 2 (I know, I'm late to the party) I wanted to use just to see if the card did anything for it. I see refinement, but of course that game was never going to show much improvement.

GTA V Online is the most challenging game to run at max settings with no issues. After using the 3080 and the 4080, I realize it is the programmers who failed at optimization. Even on the 4080 the GPU still gets hot, but you can bump up the AA and push the texture filtering to 16X, and it does not glitch or stutter. One thing the 4080 has over the 3080 is that you do not need the GPU shader buffer option on for GTA V; I figured I could turn it off since the card is supposed to be faster, with more hardware cache. GTA V's maps load almost all the way out and you can turn the grass up to ultra along with other settings, but judging by the other games, it is just GTA V that taxes systems.

Overall, it is an improvement, and the 16GB of VRAM will set you up for when true next-gen games finally arrive, which does not seem to be anytime soon. If you have a 3090/Ti, should you upgrade? I would not, to be honest. It pulls more power, but it also has more VRAM, and it is not worth selling it and paying extra for a 4080. By my testing, the 4080 is about 17% faster than a 3090/Ti, but you give up 8GB of VRAM and almost 300 GB/s of memory bandwidth (the quick calculation below shows where that number comes from), and that is not worth giving up in my opinion. In addition, the 3090/Ti still has more cores and a few other advantages, and sits in a higher class.

Is it worth upgrading from a 3080? Not at these prices. From a pure speed point of view, go for it. If it is just about having 16GB of VRAM, you could just as well have gone with an AMD card, since ray tracing is more gimmick than useful. I had a 2080 and could not see the appeal then, and I cannot see it on the 3080 or 4080 either; it is useless to me. The last big graphical innovation that actually changed graphics was tessellation, and I wish developers would push that further, from all angles. At best, RTX lighting, shadows and reflections COULD become a thing of the future, but the only one you notice NOW is the reflections, and that is more of a display gimmick than anything useful. Most of the time, the mirror images in glass are way over the top and not like real life, which is a little more subdued. It is HARD to recommend the 4080 over the 3080 at these prices, and you still give up bandwidth for the benefit of faster VRAM and GPU. Plus, games still use 8GB of VRAM or less, so 10 or 12 is still more than enough. So if you have a 3080, I do not feel it is worth the upgrade at these prices; at what the 3080's MSRP should have been, yes.
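For anyone who wants to sanity-check that bandwidth figure, peak memory bandwidth is just the effective memory data rate times the bus width. The small sketch below uses the published spec-sheet numbers for both cards (roughly 22.4 Gbps on a 256-bit bus for the 4080 and 21 Gbps on a 384-bit bus for the 3090 Ti); treat them as approximate rated values, not measurements of my cards.

```python
# Peak memory bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8.
# Data rates and bus widths are approximate published spec-sheet values.

def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

rtx_4080   = bandwidth_gbs(22.4, 256)   # ~716.8 GB/s
rtx_3090ti = bandwidth_gbs(21.0, 384)   # ~1008.0 GB/s

print(f"RTX 4080:    {rtx_4080:7.1f} GB/s")
print(f"RTX 3090 Ti: {rtx_3090ti:7.1f} GB/s")
print(f"Difference:  {rtx_3090ti - rtx_4080:7.1f} GB/s")   # ~291 GB/s, i.e. "almost 300"
```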
Until they come out with games that require 16GB of VRAM, there is no need for this card. Unless RTX marketing has convinced you that you need it and that you are experiencing something great, get an AMD instead. I STILL cannot tell what I'm supposed to be enjoying with RTX. If we can hardly see it, or what we do see does not blow our minds, then we don't need it. I personally would pass on this product; I only got it because a great deal came into my hands, so I figured I might as well take it. Also, some claim that you have to plug two of the GPU power leads in or it won't boot. I plugged all three in and mine booted. They must have weak PSUs, so don't listen to them; a weak PSU is almost always the first thing to suspect when a system fails to boot, especially after installing new hardware.
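As a rough sanity check on the power side, the budget is simple arithmetic if you assume the usual PCIe rule-of-thumb ratings of 150 W per 8-pin lead and up to 75 W from the slot, against the card's 320 W TDP from the spec table above. A minimal sketch under those assumptions:

```python
# Rough power-budget check, assuming the standard rule-of-thumb ratings:
# 150 W per 8-pin lead into the 16-pin adapter and 75 W from the PCIe slot.
TDP_W      = 320   # board power of the RTX 4080, from the spec table above
SLOT_W     = 75    # PCIe x16 slot
PER_8PIN_W = 150   # each 8-pin lead on the included adapter

for leads in (2, 3):
    available = SLOT_W + leads * PER_8PIN_W
    headroom  = available - TDP_W
    print(f"{leads} leads: {available} W available, {headroom:+} W vs. {TDP_W} W TDP")
# 2 leads: 375 W available, +55 W  -> little margin for transient spikes
# 3 leads: 525 W available, +205 W -> comfortable headroom
```

Either way, a healthy 850 W unit, as the spec table recommends, leaves plenty of margin for the rest of the system.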