Card Layout and Photos

What’s that hiding in its electrostatic bag? Don’t be shy; we won’t hurt you. The GTX 680 actually came to me in the sexy packaging below, but I had to leave the box behind in San Francisco due to limited space in my bags.

image 2

(box photos)

 

Here are full card shots of the GTX 680; even as a reference design, this is a good-looking card. I love the raised GeForce GTX logo up top, with the green highlighting it even more. I hope this carries over into other cards based on the reference design. It’s nice to be able to show off what you have inside through your side panel window.

image 3

image 4

image 5

image 7

Being a flagship card, this is really no surprise: the GTX 680 is equipped with dual SLI connectors, meaning 3-way and 4-way SLI are possible depending on the motherboard.

image 6

I have mentioned it earlier in this review, but here is a shot of the dual six-pin power connections for the GTX 680. This stacked layout is a very unique design, but there is one problem I ran into: the second connector faces inward, meaning you need small fingers to reach down to the lower plug and hold the release tab on your six-pin power connector when removing the cable. Nvidia mentioned that this layout gives them more room on the PCB for improvements and also allows the card’s fan to be better placed.

image 8

For connections, the reference design gives us two DVI ports, one HDMI port, and a full-sized DisplayPort. Nvidia also packs as much ventilation as possible into the design: the vents are pushed almost up against the DVI port, and a couple more vents have been added below the HDMI plug.

image 11

image 10

The side of the reference cooler has a nice indented section around the GPU fan; this gives the card room to breathe when it is packed up against other cards or devices. We have run into cards in the past without this breathing room, and they can get a little too hot. It’s nice that Nvidia has kept that in mind.

image 12

image 9

Here is the full back of the PCB. As you can see, the card does not come with a backplate, but they did go with a nice black PCB. The most noticeable thing on the PCB is the collection of power connection solder points up in the top left corner. Earlier, Nvidia explained why they went with the stacked power connection design, but you can see that they could have gone with either layout (cooling considerations aside). They have also included enough pads for an eight-pin connector if needed. Are these leftovers from a late design change, or are we seeing power connections that could be used on future cards based on the same PCB? Only time will tell.

image 13

Here are a few shots of the GTX 680 paired up with the card it is replacing, the GTX 580. As you can see, the new GTX 680 is actually a little bit shorter. Having that GeForce GTX logo embossed on the top of the card is a nice touch, especially next to the GTX 580.

image 14

image 15

image 16

image 17

 

garfi3ld replied the topic: #24104 22 Mar 2012 17:01

You have all seen and heard the rumors for months now about Nvidia’s upcoming GPU, code-named Kepler. When we got the call that Nvidia was inviting editors out to San Francisco to show us what they had been working on, we jumped at the chance to finally put the rumors to rest and see what they really had up their sleeve. Today we can finally tell you all the gory details and dig into the performance of Nvidia’s latest flagship video card. Will it be the fastest single-GPU card in the world? We finally find out!

Read more...
Dreyvas replied the topic: #24106 22 Mar 2012 17:41
Awesome. EVGA step-up process initiated. :)
L0rdG1gabyt3 replied the topic: #24107 22 Mar 2012 17:44
When I saw that this was up, a tear formed in my eye! When I saw that the leaked specs and performance numbers were just about spot on, I laughed triumphantly! When I saw the MSRP, I just about messed my pants!

This WILL be my next video card. (And it makes it easier for me to pitch it to my wife with such a low price!)
Wingless92 replied the topic: #24112 22 Mar 2012 20:04
I would wait for the 4GB cards. Might as well; double the frame buffer is always a good idea.
Myndmelt replied the topic: #24113 22 Mar 2012 20:16
They are doing a Live review over at pcper.com/live right now.
Reaper replied the topic: #24115 22 Mar 2012 20:26
I think I'll be picking one of these cards up. Time to let my esteemed 5870 rotate to my next lowest system. Thanks for posting your review.

Wingless92 wrote: I would wait for the 4GB cards. Might as well; double the frame buffer is always a good idea.


I won't argue that doubling anything is a bad idea, but the frame buffer is a bit misleading... Let's say you have a full 1080p Eyefinity setup going (which obviously you won't have with Nvidia, but for discussion's sake): the resolution is 5760 x 1080 x 3 bytes (24-bit color depth; you'll never have an alpha channel streamed) = ~17.8 MB. Most games at best support triple buffering, so ~54 MB. Your vertex shader inputs are going to cumulatively be more than that.

In general, the only real thing you're affecting by increasing VRAM these days is how many texture maps you can fit in memory. Even a ridiculous maximum-size texture (8192 x 8192 x 4 bytes) still only amounts to ~341 MB of VRAM when mipmapped.
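If you want to sanity-check those numbers, here's the same back-of-the-envelope math as a quick Python sketch (my own illustration only; it assumes 3 bytes per pixel for 24-bit color and the usual ~1/3 overhead for a full mipmap chain):

```python
# Quick VRAM math (assumes 3 bytes/pixel for 24-bit color, no alpha streamed)

def surface_mb(width, height, bytes_per_pixel=3):
    """Size of one frame buffer / texture surface in MB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# Triple-wide 1080p surround at 24-bit color
single = surface_mb(5760, 1080)       # ~17.8 MB
triple = 3 * single                   # ~53.4 MB with triple buffering

# Worst-case 8192 x 8192 RGBA texture; a full mipmap chain adds ~1/3
texture = surface_mb(8192, 8192, bytes_per_pixel=4) * 4 / 3  # ~341 MB

print(f"single frame buffer: {single:.1f} MB")
print(f"triple buffered:     {triple:.1f} MB")
print(f"mipmapped texture:   {texture:.0f} MB")
```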

Just saying there's no practical reason (that I can determine) not to go ahead with one of these cards in case anyone else is on the fence.
Wingless92 replied the topic: #24116 22 Mar 2012 20:54
When I had my Surround setup, I know that if I had had another 3GB EVGA Classified card I would have seen better FPS with the bigger frame buffer. The game would run better the more it can load into the buffer. Also, now that I'm on a 30in 2560x1600 panel, it's still pushing the GPUs really hard. Once the 580s come down, I'm going to do a third.

With everything cranked to the max in Alan Wake I am seeing right around 60FPS.

Sure, if you are on a 1080p monitor, then this is overkill; save the cash and buy something less powerful. These cards are meant for multiple displays or 30in panels. Other than Battlefield 3, no one is playing intensive games; most people play LoL, TF2, and DOTA2, and you don't need a 680 to run those. A 560 Ti would be fine. If I had a lower-end card I would be looking for the new 660 Ti whenever that comes out.
garfi3ld replied the topic: #24117 22 Mar 2012 21:04
Wait, you're looking for more than 60FPS? What's the point in that :-P
Wingless92 replied the topic: #24118 22 Mar 2012 21:08
I was happy with that. Before the 1.04 patch it would only do 30FPS.
jj_Sky5000 replied the topic: #24119 22 Mar 2012 22:20
The price is the most shocking part, and they have left the door open as well with the lower power consumption!!!! I think I will be adding Keplers to my collection!!!
Reaper replied the topic: #24120 22 Mar 2012 22:41

Wingless92 wrote: When I had my Surround setup I know if I would have had another 3gb EVGA Classified card I would have seen better FPS with the bigger frame buffer. The game would run better with the more that it can load into the buffer. Also, now that i'm on a 30in 2560x1600 panel its still pushing the GPU's really hard. Once the 580's come down I'm going to do a third.

Hrm... you'd need to clear that all up for me before I'd give a real serious reply, but just quickly, and to keep it all in context of what I was sayin' earlier: a frame buffer has zero to do with frame rate on today's graphics cards due to its negligible size - it's a small drop in a huge bucket. The (primary) frame buffer is a static reserved area in VRAM based upon the currently selected monitor resolution and color depth. The GPU streams the frame buffer to the monitor (or monitors) once per VSYNC strobe, regardless of whether you have VSYNC delay turned on in your game. So frame buffer size really doesn't affect frame rate.
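To put "small drop in a huge bucket" into rough numbers, here's a quick sketch (my own ballpark illustration; it assumes 3 bytes per pixel and uses the GTX 680's ~192 GB/s memory bandwidth as an approximate reference figure):

```python
# Rough scanout cost vs. total memory bandwidth (ballpark figures only)

def scanout_gb_per_s(width, height, refresh_hz, bytes_per_pixel=3):
    """Bandwidth spent streaming the frame buffer to the display each second."""
    return width * height * bytes_per_pixel * refresh_hz / (1000 ** 3)

scanout = scanout_gb_per_s(2560, 1600, 60)  # a 30in panel at 60 Hz, ~0.74 GB/s
card_bw = 192.0                             # GB/s, approx. GTX 680 memory bandwidth

print(f"scanout: {scanout:.2f} GB/s")
print(f"share:   {scanout / card_bw:.2%} of total bandwidth")  # well under 1%
```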

You mentioned "had another 3GB ... card"; what I'm not sure of here is whether you mean two SLI'd cards. Of course that's going to process faster, but everyone knows that, so I'm not sure what you were getting at there :-D .

jj_Sky5000 wrote: The price is the most shocking part, and they have left the door open as well with the lower power consumption!!!! I think I will be adding Keplers to my collection!!!


Amen brotha... Now just time to pick a manufacturer!
Dreyvas replied the topic: #24121 22 Mar 2012 23:00
EVGA's website seems to be down right now, heh.
Myndmelt replied the topic: #24122 22 Mar 2012 23:07
Check out this video about it.
Wingless92 replied the topic: #24127 23 Mar 2012 00:09
The new Precision and OC Scanner apps from EVGA are looking pretty good. I use OC Scanner all the time to test my cards out.

Anyone else look over on Newegg and see that PNY was trolling and asking $20 more for their cards? I'll take EVGA until they go downhill which I hope will be never.
Reaper replied the topic: #24130 23 Mar 2012 00:26

Wingless92 wrote: Anyone else look over on Newegg and see that PNY was trolling and asking $20 more for their cards? I'll take EVGA until they go downhill which I hope will be never.


It's actually $30 (right now) ... but yeah I am angling towards picking up an EVGA. Several NewEgg gift cards ready to be used up :-)
Arxon replied the topic: #24158 23 Mar 2012 09:53
Sadly this card isn't much better than a 7970. Wait till ATi/AMD come out with the 7990.

"AMD is expected to release its flagship dual-GPU graphics card based on the "Tahiti" platform in 2012. We are learning under the codename "New Zealand", AMD is working on the monster HD 7990, which will utilize two HD 7970's and 6GB of total graphics memory (3 GB per GPU system). If AMD will indeed go for the HD 7970, this could mean that the card will include 62 compute units for a total of 4096 stream processors, 256 texture units and 64 full color ROPs."
garfi3ld replied the topic: #24159 23 Mar 2012 10:32
Arxon, you have to remember two things. First, a two-GPU card isn't "topping" Nvidia; they can do the same thing. Second, not being "much" better is still a significant gap considering the price difference right at launch.

This isn't coming from someone with any bias; I even gave two awards to AMD cards this week.
garfi3ld replied the topic: #24160 23 Mar 2012 10:37
Also, I think the Nvidia drivers, although much more mature and stable than what AMD launched with, still have a lot of room for improvement in in-game performance. 3DMark shows what the card is capable of. I wouldn't be shocked if we saw major improvements in some games by the time we get to cards like the GTX 670 or 660.
Arxon replied the topic: #24161 23 Mar 2012 10:56
I didn't realize the 7990 had two GPUs when I made the first comment; then I read about it. As for price, from what I see it's $50 cheaper, but the ATI card has more stream processors and faster RAM, with a slightly slower core clock on stock models. Not much of a gain for $50, but I wouldn't have to buy a second card for triple monitors, unless Nvidia changed that. (Just watched that video, and at the end they did change it so you only need one card.)
garfi3ld replied the topic: #24162 23 Mar 2012 11:02
Nvidia changed that... and I wouldn't pay the same price, let alone more, for a GPU that is slower, no matter how many stream processors it has. :-P

We will have to revisit everything once the dust settles.
Arxon replied the topic: #24163 23 Mar 2012 11:05
Yeah, I wanna see the 690 now to see if it is gonna be like the 7990 with dual 680s
THUMPer replied the topic: #24173 23 Mar 2012 18:14
Where's the GPU compute performance? After all, Nvidia created the GPGPU market and the CUDA programming environment. NV can tweak their drivers for games and benchmarks, that's obvious, but they shat where they sleep.
AMD needs to bring their prices down or they won't sell anything. I would also like to see more info on GPU Boost.
Wingless92 replied the topic: #24175 23 Mar 2012 18:18
I'm liking the new card. Is it going to make me run out and buy 3 new cards? Nah, I'll just buy another 580, run tri-SLI, and be set for a long time.
L0rdG1gabyt3 replied the topic: #24179 23 Mar 2012 18:48

THUMPer wrote: Where's the GPU compute performance? After all, Nvidia created the GPGPU market and the CUDA programming environment. NV can tweak their drivers for games and benchmarks, that's obvious, but they shat where they sleep.
AMD needs to bring their prices down or they won't sell anything. I would also like to see more info on GPU Boost.

According to the live stream over at pcper yesterday, the Nvidia rep said that compute performance is going to be more focused on the Quadro line as a way to differentiate the two product lines. It's sad.

On another note.... the 680s seem to scale pretty well!

