Introducing NVIDIA’s GeForce 800M Lineup for Laptops

Last month NVIDIA launched the first of many Maxwell parts to come with the desktop GTX 750 and GTX 750 Ti, which brought a new architecture to NVIDIA’s parts, but one that isn’t radically different from the previous generation’s Kepler. While the features may be largely the same, however, NVIDIA did come out with a renewed focus on efficiency. The result was roughly a doubling of performance per Watt, with the GTX 750 Ti being nearly twice as fast as the GTX 650 at only slightly higher power draw (and some of that extra draw most likely comes from the increased load on the rest of the system thanks to the higher frame rates). That renewed focus on efficiency is nice and all on the desktop, but in my opinion where it’s really going to pay dividends is in the mobile SKUs.

Today’s launch of the 800M series gives us the first taste of what’s to come, but unfortunately there are two minor issues. One is that we don’t have any 800M hardware in hand for testing (yet; we should get a notebook in the near future). The second is that, as is typically the case, the 800M line will be a mix of both Kepler and Maxwell parts. The Kepler parts aren’t a straight recycling of existing SKUs, however, as NVIDIA has a new feature coming out with all of the GTX 800M parts: Battery Boost. But before we get into the details of Battery Boost, let’s cover the various parts. Both "regular" (NVIDIA has dropped the "GT" branding of its mainstream parts) and GTX 800M chips are being announced today, though we of course still need them to show up in shipping laptops. We’ll start at the high end and work our way down.

NVIDIA GeForce GTX 800M Specifications

At the top, the GTX 880M carries on from the successful GTX 780M, using a fully enabled GK104 chip with 1536 cores. The difference is that, thanks to improvements in yields and other refinements, the GTX 880M will launch with a base clock of 954MHz, a pretty significant 20% bump over the 797MHz base clock of the GTX 780M. This is really the only chip where we won’t see any major performance improvement relative to the 700M part: we get a theoretical 20% shader performance increase and that’s about it. Otherwise, the only real change will be support for Battery Boost. The GTX 870M follows a slightly different pattern, using the same GK104 core but with one SMX disabled, leaving us with 1344 cores.

Forum discussion: No Man’s Sky on a GTX 860M

Originally posted by Macknight: My computer is less powerful than yours, with an i7 2.4GHz and a GeForce GTX 860M.

Reply: I can run No Man’s Sky at a consistent 30 frames per second at 1366x768, with High Textures, High Graphical Detail, and 16x anisotropic filtering; you should be able to get at least that. Make sure the shader cache is disabled in the settings, that G-SYNC is disabled for the NMS binaries in the graphics settings file, and that anything in the NVIDIA Control Panel relating to No Man’s Sky is set for performance where possible. If you have widespread performance problems across all programs, that lends credence to your machine being faulty or developing a fault; if your laptop can play other graphically intense games well, but not No Man’s Sky, then it isn’t your rig failing. The latter, unfortunately, is the main problem with No Man’s Sky.

Another reply: The 860M is slightly better than a 950M, though I’m not sure how much VRAM the OP has.

Follow-up: 860M is better than a 950M? Christ, I give up trying to understand these numeric designations. You’d think they’d just stick to the simplicity of "higher number = better."
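The theoretical 20% shader uplift quoted above for the GTX 880M can be sanity-checked from the published core counts and base clocks. A minimal sketch, assuming throughput scales as cores × clock with 2 FLOPs per core per cycle (one fused multiply-add, the usual convention for these chips) and ignoring Boost clocks:

```python
# Theoretical single-precision throughput: cores x clock x 2 FLOPs (FMA) per cycle.
def gflops(cores: int, base_mhz: int) -> float:
    return 2 * cores * base_mhz / 1000.0

gtx_780m = gflops(1536, 797)  # fully enabled GK104 at 797 MHz base
gtx_880m = gflops(1536, 954)  # same chip at 954 MHz base

uplift = gtx_880m / gtx_780m - 1
print(f"GTX 780M: {gtx_780m:.0f} GFLOPS")
print(f"GTX 880M: {gtx_880m:.0f} GFLOPS ({uplift:.0%} uplift)")  # ~20% uplift
```

Since the core count is unchanged, the throughput ratio reduces to the clock ratio, 954/797 ≈ 1.197, which is where the article's "theoretical 20%" figure comes from.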