User:Terasail/sandbox/article
===Pre-GeForce===
{{further|Fahrenheit (microarchitecture)}}
{|class="wikitable"
!colspan=11|Fahrenheit GPU List
|-
!Model name
|[[NV1|STG-2000]]
|[[Riva 128]]
|[[Riva 128]]ZX
|[[Riva TNT]]
|Vanta
|Vanta LT
|Riva TNT2 M64
|[[Riva TNT2]]
|[[Riva TNT2]] Pro
|[[Riva TNT2]] Ultra
|-
!Launch date
|May 22, 1995
|August 25, 1997
|February 23, 1998
|June 15, 1998
|March 22, 1999
|March 2000
|October 1999
|March 15, 1999
|October 12, 1999
|March 15, 1999
|}
{{notelist}}

===GeForce 256===
{{Further|GeForce 256|Celsius (microarchitecture)}}
* All models are made via [[TSMC]] [[250 nm process|220 nm]] fabrication process
* All models support [[Direct3D]] 7.0 and [[OpenGL]] 1.2
* All models support hardware Transform and Lighting (T&L) and Cube Environment Mapping
{|class="wikitable"
!colspan=3|GeForce 256 GPU List
|-
!Model name
|GeForce 256 SDR
|GeForce 256 DDR
|-
!Launch date
|October 11, 1999
|December 13, 1999
|}
{{notelist}}

===GeForce 2 series===
{{Further|GeForce 2 series|Celsius (microarchitecture)}}
* All models support [[Direct3D]] 7 and [[OpenGL]] 1.2
* All models support TwinView Dual-Display Architecture, Second Generation Transform and Lighting (T&L), Nvidia Shading Rasterizer (NSR), High-Definition Video Processor (HDVP)
* GeForce2 MX models support Digital Vibrance Control (DVC)
{|class="wikitable"
!colspan=9|GeForce 2 series GPU List
|-
!Model name
|GeForce2 MX IGP + nForce 220/420
|GeForce2 MX200
|GeForce2 MX
|GeForce2 MX400
|GeForce2 GTS
|GeForce2 Pro
|GeForce2 Ti
|GeForce2 Ultra
|-
!Launch date
|June 4, 2001
|March 3, 2001
|June 28, 2000
|March 3, 2001
|April 26, 2000
|December 5, 2000
|October 1, 2001
|August 14, 2000
|}

===GeForce3 series===
{{Further|GeForce 3 series|Kelvin (microarchitecture)}}
* All models are made via [[TSMC]] [[180 nm process|150 nm]] fabrication process
* All models support [[Direct3D]] 8.0 and [[OpenGL]] 1.3
* All models support 3D Textures, Lightspeed Memory Architecture (LMA), nFiniteFX Engine, Shadow Buffers
{|class="wikitable"
!colspan=4|GeForce3 series GPU List
|-
!Model name
|GeForce3 Ti200
|GeForce3
|GeForce3 Ti500
|-
!Launch date
|October 1, 2001
|February 27, 2001
|October 1, 2001
|}

===GeForce4 series===
{{Further|GeForce 4 series|Kelvin (microarchitecture)}}
* All models are made via [[TSMC]] [[180 nm process|150 nm]] fabrication process
* All models support Accuview Antialiasing (AA), Lightspeed Memory Architecture II (LMA II), nView
{|class="wikitable"
!colspan=15|GeForce 4 series GPU List
|-style="font-size:85%"
!Model name
|GeForce4 MX IGP + nForce2
|GeForce4 MX420
|GeForce4 MX440 SE
|GeForce MX4000
|GeForce PCX4300
|GeForce4 MX440
|GeForce4 MX440 8x
|GeForce4 MX460
|GeForce4 Ti4200
|GeForce4 Ti4200 8x
|GeForce4 Ti4400
|GeForce4 Ti4400 8x (Ti4800SE)
|GeForce4 Ti4600
|GeForce4 Ti4600 8x (Ti4800)
|-style="font-size:85%"
!Launch date
|October 1, 2002
|February 6, 2002
|2002
|December 14, 2003
|February 19, 2004
|February 6, 2002
|September 25, 2002
|February 6, 2002
|April 16, 2002
|September 25, 2002
|February 6, 2002
|January 20, 2003
|February 6, 2002
|January 20, 2003
|}

===GeForce FX (5xxx) series===
{{Further|GeForce FX series|Rankine (microarchitecture)}}
* All models support [[Direct3D]] 9.0a and [[OpenGL]] 1.5 (2.1 (software) with latest drivers)
* The GeForce FX series runs vertex shaders in an array
{|class="wikitable" style="text-align:center;"
!colspan=15|GeForce FX (5100–5700) GPU List
|-
!Model name
|GeForce FX 5100
|GeForce FX 5200 LE
|GeForce FX 5200
|GeForce FX 5200 Ultra
|GeForce PCX 5300
|GeForce FX 5500
|GeForce FX 5600 XT
|GeForce FX 5600
|GeForce FX 5600 Ultra
|GeForce FX 5600 Ultra Rev.2
|GeForce FX 5700 VE
|GeForce FX 5700 LE
|GeForce FX 5700
|-
!Launch date
|colspan=3|March 2003
|March 6, 2003
|March 17, 2004
|March 2004
|October 2003
|March 2003
|colspan=2|March 6, 2003
|September 2004
|March 2004
|2003
|-
!colspan=15|GeForce FX (5750–5950) GPU List
|-
!Model name
|GeForce PCX 5750
|GeForce FX 5700 Ultra
|GeForce FX 5700 Ultra GDDR3
|GeForce FX 5800
|GeForce FX 5800 Ultra
|GeForce FX 5900 ZT
|GeForce FX 5900 XT
|GeForce FX 5900
|GeForce FX 5900 Ultra
|GeForce PCX 5900
|GeForce FX 5950 Ultra
|GeForce PCX 5950
|-
!Launch date
|March 17, 2004
|October 23, 2003
|March 15, 2004
|colspan=2|January 27, 2003
|December 15, 2003
|December 15, 2003
|May 2003
|May 12, 2003
|March 17, 2004
|October 23, 2003
|February 17, 2004
|}

===GeForce 6 (6xxx) series===
{{Further|GeForce 6 series|Curie (microarchitecture)}}
* All models support [[Direct3D]] 9.0c and [[OpenGL]] 2.1
* All models support Transparency [[Spatial anti-aliasing|AA]] (starting with version 91.47 of the ForceWare drivers) and PureVideo
{|class="wikitable" style="text-align:center;"
!colspan=11|GeForce 6 (6100–6600) series GPU List
|-
!Model name
|GeForce 6100 + nForce 410
|GeForce 6150 SE + nForce 430
|GeForce 6150 LE + nForce 430
|GeForce 6150 + nForce 430
|GeForce 6200 LE
|GeForce 6200A
|GeForce 6200
|GeForce 6200 TurboCache
|GeForce 6500
|GeForce 6600 LE
|-
!rowspan=2|Launch date
|rowspan=2|October 20, 2005
|rowspan=2 colspan=2|June 2006
|rowspan=2|October 20, 2005
|rowspan=2|2005
|rowspan=2|April 4, 2005
|October 12, 2004 (PCIe)
|rowspan=2|December 15, 2004
|rowspan=2|October 1, 2005
|rowspan=2|2005
|-
|January 17, 2005 (AGP)
|-
!colspan=11|GeForce 6 (6600–6800) series GPU List
|-
!Model name
|GeForce 6600
|GeForce 6600 GT
|GeForce 6800 LE
|GeForce 6800 XT
|GeForce 6800
|GeForce 6800 GTO
|GeForce 6800 GS
|GeForce 6800 GT
|GeForce 6800 Ultra
|GeForce 6800 Ultra Extreme Edition
|-
!rowspan=3|Launch date
|rowspan=3|August 12, 2004
|August 12, 2004 (PCIe)
|January 16, 2005 (PCIe)
|rowspan=3|September 30, 2005
|November 8, 2004 (PCIe)
|rowspan=3|April 14, 2004
|November 7, 2005 (PCIe)
|June 28, 2004 (PCIe)
|June 28, 2004 (PCIe)
|rowspan=3|May 4, 2004
|-
|rowspan=2|November 14, 2004 (AGP)
|rowspan=2|July 22, 2004 (AGP)
|rowspan=2|April 14, 2004 (AGP)
|rowspan=2|December 8, 2005 (AGP)
|rowspan=2|May 4, 2004 (AGP)
|May 4, 2004 (AGP)
|-
|March 14, 2005 (512 MB)
|}

===GeForce 7 (7xxx) series===
{{Further|GeForce 7 series|Curie (microarchitecture)}}
* All models support [[Direct3D]] 9.0c and [[OpenGL]] 2.1
* All models support Transparency [[Spatial anti-aliasing|AA]] (starting with version 91.47 of the ForceWare drivers)
{|class="wikitable" style="text-align:center;"
!colspan=10|GeForce 7 (7025–7300 LE) GPU List
|-
!Model name
|GeForce 7025 + nForce 630a
|GeForce 7050PV + nForce 630a
|GeForce 7050 + nForce 610i/630i
|GeForce 7100 + nForce 630i
|GeForce 7150 + nForce 630i
|GeForce 7100 GS
|GeForce 7200 GS
|GeForce 7300 SE
|GeForce 7300 LE
|-
!Launch date
|colspan=5|July 2007
|August 8, 2006
|January 18, 2006
|colspan=2|March 22, 2006
|-
!colspan=10|GeForce 7 (7300 GS–7800 GT) GPU List
|-
!Model name
|GeForce 7300 GS
|GeForce 7300 GT
|GeForce 7500 LE
|GeForce 7600 GS
|GeForce 7600 GT
|GeForce 7600 GT 80 nm
|GeForce 7650 GS
|GeForce 7800 GS
|GeForce 7800 GT
|-
!Launch date
|January 18, 2006
|May 15, 2006
|2006
|March 22, 2006 (PCIe)<br />July 1, 2006 (AGP)
|March 9, 2006 (PCIe)<br />July 15, 2006 (AGP)
|January 8, 2007
|March 22, 2006
|February 2, 2006
|August 11, 2005
|-
!colspan=10|GeForce 7 (7800 GTX–7950 GX2) GPU List
|-
!Model name
|GeForce 7800 GTX
|GeForce 7900 GS
|GeForce 7900 GT
|GeForce 7900 GTO
|GeForce 7900 GTX
|GeForce 7900 GX2
|GeForce 7950 GT
|GeForce 7950 GX2
|-
!Launch date
|June 22, 2005 (256 MB)<br />November 14, 2005 (512 MB)
|May 2006 (PCIe)<br />April 2, 2007 (AGP)
|colspan=2|March 9, 2006
|October 1, 2006
|March 9, 2006
|September 6, 2006 (PCIe)<br />April 2, 2007 (AGP)
|June 5, 2006
|}

===GeForce 8 (8xxx) series===
{{Further|GeForce 8 series|Tesla (microarchitecture)}}
* All models support coverage sample anti-aliasing, angle-independent anisotropic filtering, and 128-bit OpenEXR HDR.
{|class="wikitable" style="text-align:center;"
!colspan=10|GeForce 8 (8100–8600 GS) GPU List
|-
!Model name
|GeForce 8100 mGPU
|GeForce 8200 mGPU
|GeForce 8300 mGPU
|GeForce 8300 GS
|GeForce 8400 GS
|GeForce 8400 GS rev.2
|GeForce 8400 GS rev.3
|GeForce 8500 GT
|GeForce 8600 GS
|-
!Launch date
|colspan=3|2008
|July 2007
|June 15, 2007
|December 10, 2007
|July 12, 2010
|April 17, 2007
|April 2007
|-
!colspan=10|GeForce 8 (8600 GT–8800) GPU List
|-
!Model name
|GeForce 8600 GT
|GeForce 8600 GTS
|GeForce 8800 GS
|GeForce 8800 GTS (G80)
|GeForce 8800 GTS 112 (G80)
|GeForce 8800 GT
|GeForce 8800 GTS (G92)
|GeForce 8800 GTX
|GeForce 8800 Ultra
|-
!Launch date
|colspan=2|April 17, 2007
|January 2008
|February 12, 2007 (320)<br />November 8, 2006 (640)
|November 19, 2007
|October 29, 2007 (512)<br />December 11, 2007 (256, 1024)
|December 11, 2007
|November 8, 2006
|May 2, 2007
|}

===GeForce 9 (9xxx) series===
{{Further|GeForce 9 series|Tesla (microarchitecture)}}
* All models support Coverage Sample Anti-Aliasing, Angle-Independent Anisotropic Filtering, 128-bit OpenEXR HDR
{|class="wikitable" style="text-align:center;"
!colspan=9|GeForce 9 (9300–9600 GSO) GPU List
|-
!Model name
|GeForce 9300 mGPU
|GeForce 9400 mGPU
|GeForce 9300 GE
|GeForce 9300 GS
|GeForce 9400 GT
|GeForce 9500 GT
|GeForce 9600 GS
|GeForce 9600 GSO
|-
!Launch date
|colspan=2|October 2008
|colspan=2|June 2008
|August 27, 2008
|colspan=2|July 29, 2008
|May 2008
|-
!colspan=9|GeForce 9 (9600 GSO 512–9800) GPU List
|-
!Model name
|GeForce 9600 GSO 512
|GeForce 9600 GT Green Edition
|GeForce 9600 GT
|GeForce 9800 GT Green Edition
|GeForce 9800 GT
|GeForce 9800 GTX
|GeForce 9800 GTX+
|GeForce 9800 GX2
|-
!Launch date
|October 2008
|2009
|February 21, 2008
|2009
|July 2008
|April 1, 2008
|July 16, 2008
|March 18, 2008
|}

===GeForce 100 series===
{{Further|GeForce 100 series|Tesla (microarchitecture)}}
{|class="wikitable" style="text-align:center;"
!colspan=6|GeForce 100 series GPU List
|-
!Model name
|GeForce G 100
|GeForce GT 120
|GeForce GT 130
|GeForce GT 140
|GeForce GTS 150
|-
!Launch date
|colspan=5|March 10, 2009
|}

===GeForce 200 series===
{{Further|GeForce 200 series|Tesla (microarchitecture)}}
* All models support Coverage Sample Anti-Aliasing, Angle-Independent Anisotropic Filtering, 128-bit OpenEXR HDR
{|class="wikitable" style="text-align:center;"
!Model name
!Launch date
|-
|GeForce 205
|November 26, 2009
|-
|GeForce 210
|rowspan=2|October 12, 2009
|-
|GeForce GT 220
|-
|rowspan=2|GeForce GT 230
|October 12, 2009
|-
|April 27, 2009
|-
|GeForce GT 240
|November 17, 2009
|-
|GeForce GTS 240
|July 1, 2009
|-
|rowspan=2|GeForce GTS 250
|2009
|-
|March 3, 2009
|-
|rowspan=2|GeForce GTX 260
|June 16, 2008
|-
|September 16, 2008<br />November 27, 2008 (55 nm)
|-
|GeForce GTX 275
|April 9, 2009
|-
|GeForce GTX 280
|June 17, 2008
|-
|GeForce GTX 285
|January 15, 2009
|-
|GeForce GTX 295
|January 8, 2009
|}

===GeForce 300 series===
{{Further|GeForce 300 series|Tesla (microarchitecture)}}
* All models support the following [[Application programming interface|API]] levels: [[Direct3D]] 10.1 and [[OpenGL]] 3.3
{|class="wikitable" style="text-align:center;"
!Model name
!Launch date
|-
|GeForce 310
|November 27, 2009
|-
|GeForce 315
|rowspan=4|February 2010
|-
|GeForce GT 320
|-
|GeForce GT 330
|-
|GeForce GT 340
|}

===GeForce 400 series===
{{Further|GeForce 400 series|Fermi (microarchitecture)}}
* All cards have a PCIe 2.0 x16 [[Computer bus|Bus]] [[I/O interface|interface]].
* Vulkan 1.0's baseline hardware requirement is OpenGL ES 3.1 functionality, a subset of OpenGL 4.3, which all Fermi and newer cards support.
* Memory bandwidths stated in the following table refer to Nvidia reference designs. Actual bandwidth can be higher or ''lower'' depending on the maker of the graphics board.
{|class="wikitable" style="text-align:center;"
!Model name
!Launch date
!TDP (Watts)
!Release Price (USD)
|-
!GeForce 405
|September 16, 2011
|30.5
|rowspan=3|OEM
|-
!GeForce GT 420
|September 3, 2010
|50
|-
!rowspan=2|GeForce GT 430
|rowspan=2|October 11, 2010
|60
|-
|49
|$79
|-
!rowspan=2|GeForce GT 440
|October 11, 2010
|56
|OEM
|-
|February 1, 2011
|65
|$100
|-
!rowspan=2|GeForce GTS 450
|October 11, 2010
|rowspan=2|106
|OEM
|-
|September 13, 2010<br>March 15, 2011
|$129
|-
!GeForce GTX 460 SE
|November 15, 2010
|150
|$160
|-
!rowspan=4|GeForce GTX 460
|October 11, 2010
|rowspan=2|150
|OEM
|-
|rowspan=2|July 12, 2010
|$199
|-
|rowspan=2|160
|$229
|-
|September 24, 2011
|$199
|-
!GeForce GTX 465
|May 31, 2010
|200
|$279
|-
!GeForce GTX 470
|March 26, 2010
|215
|$349
|-
!GeForce GTX 480
|March 26, 2010
|250
|$499
|}
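The reference-design bandwidth note above follows from simple arithmetic: peak memory bandwidth is the effective memory clock multiplied by the bus width in bytes. A minimal sketch (the function name is illustrative; the example figures are the GeForce GTX 680 reference values listed in the GeForce 600 series table):

```python
def memory_bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth: transfers per second (effective clock) times bytes per transfer."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GeForce GTX 680 reference design: 6008 MHz effective GDDR5 on a 256-bit bus
print(memory_bandwidth_gb_s(6008, 256))  # → 192.256 (GB/s), as listed in the 600-series table
```

A board partner that ships faster or slower memory chips shifts this figure proportionally, which is why the tables note that actual bandwidth may differ from the reference design.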

===GeForce 500 series===
{{Further|GeForce 500 series|Fermi (microarchitecture)}}
{|class="wikitable" style="text-align:center;"
!Model name
!Launch date
!TDP (Watts)
!Release Price (USD)
|-
!GeForce 510
|September 29, 2011
|25
|OEM
|-
!GeForce GT 520
|April 12, 2011
|29
|$59
|-
!GeForce GT 530
|rowspan=3|May 14, 2011
|50
|rowspan=2|OEM
|-
!rowspan=2|GeForce GT 545
|105
|-
|70
|$149
|-
!GeForce GTX 550 Ti
|March 15, 2011
|116
|$149
|-
!GeForce GTX 555
|May 14, 2011
|rowspan=3|150
|rowspan=2|OEM
|-
!GeForce GTX 560 SE
|February 20, 2012
|-
!GeForce GTX 560
|May 17, 2011
|$199
|-
!rowspan=2|GeForce GTX 560 Ti
|May 30, 2011
|210
|OEM
|-
|January 25, 2011
|170
|$249
|-
!GeForce GTX 560 Ti 448 Cores
|November 29, 2011
|210
|$289
|-
!GeForce GTX 570
|December 7, 2010
|219
|$349
|-
!GeForce GTX 580
|November 9, 2010
|244
|$499
|-
!GeForce GTX 590
|March 24, 2011
|365
|$699
|}

===GeForce 600 series===
{{Further|GeForce 600 series|Kepler (microarchitecture)}}
* [[Nvidia NVENC|NVENC]] added on GTX cards
* Several 600 series cards are rebranded 400 or 500 series cards.
{| class="wikitable" style="font-size: 80%; text-align: center;"
|-
! rowspan="2" | Model
! rowspan="2" | Launch
! rowspan="2" | [[Code name]]
! rowspan="2" | Fab ([[Nanometer|nm]])
! rowspan="2" | Transistors (million)
! rowspan="2" | Die size (mm<sup>2</sup>)
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]]
! rowspan="2" | SM count
! rowspan="2" | Core config{{efn|name=geforce 600 1|[[Unified shader model|Unified shaders]]: [[texture mapping unit]]s: [[render output unit]]s}}
! colspan="5" | Clock rate
! colspan="2" | [[Fillrate]]
! colspan="4" | Memory configuration
! colspan="4" | Supported [[Application programming interface|API]] version
! colspan="2" | Processing power ([[GFLOPS]]){{efn|name=geforce 600 10|To calculate the processing power see [[Kepler (microarchitecture)#Performance]], or [[Fermi (microarchitecture)#Performance]].}}
! rowspan="2" | [[Thermal design power|TDP]] (Watts)
! rowspan="2" | Release Price (USD)
|-
! Core ([[Hertz|MHz]])
! Average Boost ([[Hertz|MHz]])
! Max Boost ([[Hertz|MHz]])
! Shader ([[Hertz|MHz]])
! Memory ([[Hertz|MHz]])
! Pixel ([[Pixel|GP]]/s)
! Texture ([[Texel (graphics)|GT]]/s)
! Size ([[Megabyte|MB]])
! Bandwidth ([[Gigabyte|GB]]/s)
! DRAM type
! Bus width ([[bit]])
! [[Vulkan (API)|Vulkan]]{{efn|name=geforce 600 11|Vulkan 1.2 is only supported on Kepler cards.<ref name="vulkandrv" />}}
! [[Direct3D]]
! [[OpenGL]]
! [[OpenCL]]
! [[Single precision floating-point format|Single precision]]
! [[Double precision floating-point format|Double precision]]
|-
! style="text-align:left;" | GeForce 605{{efn|name=geforce 600 2|The GeForce 605 (OEM) card is a rebranded GeForce 510.}}
| April 3, 2012
| GF119
| rowspan="5" | [[TSMC]] [[40 nm]]
| rowspan="3" | 292
| rowspan="3" | 79
| PCIe 2.0 x16
| rowspan="3" | 1
| 48:8:4
| 523
| {{N/a}}
| {{N/a}}
| 1046
| 898<br />(1796)
| 2.09
| 4.2
| 512<br />1024
| 14.4
| rowspan="7" | DDR3
| rowspan="5" | 64
| rowspan="5" {{N/a}}
| rowspan="27" | 12
| rowspan="27" | 4.6
| rowspan="27" | 1.2
| 100.4
| {{unk}}
| 25
| OEM
|-
! style="text-align:left;" | GeForce GT 610{{efn|name=geforce 600 3|The GeForce GT 610 card is a rebranded GeForce GT 520.}}
| May 15, 2012
| GF119-300-A1
| PCIe 2.0 x16, PCIe x1, PCI
| 48:8:4
| rowspan="2" | 810
| {{N/a}}
| {{N/a}}
| rowspan="2" | 1620
| 1000<br />1800
| rowspan="2" | 3.24
| 6.5
| 512<br />1024<br />2048
| 8<br />14.4
| 155.5
| {{unk}}
| 29
| Retail
|-
! rowspan="2" style="text-align:left;" | GeForce GT 620{{efn|name=geforce 600 4|The GeForce GT 620 (OEM) card is a rebranded GeForce GT 520.}}
| April 3, 2012
| GF119
| rowspan="3" | PCIe 2.0 x16
| 48:8:4
| {{N/a}}
| {{N/a}}
| 898<br />(1796)
| 6.5
| 512<br />1024
| 14.4
| 155.5
| {{unk}}
| 30
| OEM
|-
| May 15, 2012
| GF108-100-KB-A1
| 585
| 116
| 2
| 96:16:4
| 700
| {{N/a}}
| {{N/a}}
| 1400
| 1000–1800
| 2.8
| 11.2
| 1024<br />2048
| 8–14.4
| 268.8
| {{unk}}
| 49
| Retail
|-
! style="text-align:left;" | GeForce GT 625
| February 19, 2013
| GF119
| 292
| 79
| rowspan="2" | 1
| 48:8:4
| 810
| {{N/a}}
| {{N/a}}
| 1620
| 898<br />(1796)
| 3.24
| 6.5
| 512<br />1024
| 14.4
| 155.5
| {{unk}}
| 30
| rowspan="2" | OEM
|-
! rowspan="4" style="text-align:left;" | GeForce GT 630{{efn|name=geforce 600 6|The GeForce GT 630 (DDR3, 128-bit, retail) card is a rebranded GeForce GT 430 (DDR3, 128-bit).}}{{efn|name=geforce 600 7|The GeForce GT 630 (GDDR5) card is a rebranded GeForce GT 440 (GDDR5).}}
| April 24, 2012
| GK107
| TSMC [[28 nm]]
| 1300
| 118
| PCIe 3.0 x16
| 192:16:16
| 875
| {{N/a}}
| {{N/a}}
| 875
| 891<br />(1782)
| 14
| 14
| 1024<br />2048
| 28.5
| rowspan="3" | 128
| 1.2
| 336
| 14
| 50
|-
| rowspan="2" | May 15, 2012
| GF108-400-A1
| rowspan="2" | TSMC 40 nm
| rowspan="2" | 585
| rowspan="2" | 116
| rowspan="2" | PCIe 2.0 x16
| rowspan="2" | 2
| 96:16:4
| 700
| {{N/a}}
| {{N/a}}
| 1620
| 1600–1800
| 2.8
| 11.2
| 1024<br />2048<br />4096
| 25.6–28.8
| rowspan="2" {{N/a}}
| 311
| {{unk}}
| 49
| rowspan="2" | Retail
|-
| GF108
| 96:16:4
| 810
| {{N/a}}
| {{N/a}}
| 1620
| 800<br />(3200)
| 3.2
| 13
| 1024
| 51.2
| GDDR5
| 311
| {{unk}}
| 65
|-
| May 29, 2013
| GK208-301-A1
| rowspan="2" | TSMC 28 nm
| rowspan="2" | 1020
| rowspan="2" | 79
| PCIe 2.0 x8
| rowspan="2" | 1
| 384:16:8
| 902
| {{N/a}}
| {{N/a}}
| 902
| 900<br />(1800)
| 7.22
| 14.44
| rowspan="2" | 1024<br />2048
| 14.4
| rowspan="5" | DDR3
| rowspan="2" | 64
| rowspan="2" | 1.2
| 692.7
| {{unk}}
| 25
|
|-
! style="text-align:left;" | GeForce GT 635
| February 19, 2013
| GK208
| PCIe 3.0 x8
| 384:16:8
| 967
| {{N/a}}
| {{N/a}}
| 967
| 1001<br />(2002)
| 7.74
| 15.5
| 16
| 742.7
| {{unk}}
| 35
| rowspan="3" | OEM
|-
! rowspan="5" style="text-align:left;" | GeForce GT 640{{efn|name=geforce 600 8|The GeForce GT 640 (OEM) GF116 card is a rebranded GeForce GT 545 (DDR3).}}
| rowspan="2" | April 24, 2012
| GF116
| TSMC 40 nm
| 1170
| 238
| PCIe 2.0 x16
| 3
| 144:24:24
| 720
| {{N/a}}
| {{N/a}}
| 1440
| 891<br />(1782)
| 17.3
| 17.3
| 1536<br />3072
| 42.8
| 192
| {{N/a}}
| 414.7
| {{unk}}
| 75
|-
| rowspan="3" | GK107
| rowspan="3" | TSMC 28 nm
| rowspan="3" | 1300
| rowspan="3" | 118
| rowspan="3" | PCIe 3.0 x16
| rowspan="4" | 2
| rowspan="3" | 384:32:16
| 797
| {{N/a}}
| {{N/a}}
| 797
| 891<br />(1782)
| 12.8
| 25.5
| 1024<br />2048
| 28.5
| rowspan="3" | 128
| rowspan="4" | 1.2
| 612.1
| 25.50
| 50
|-
| June 5, 2012
| 900
| {{N/a}}
| {{N/a}}
| 900
| 891<br />(1782)
| 14.4
| 28.8
| 2048<br />4096
| 28.5
| 691.2
| 28.8
| 65
| $100
|-
| April 24, 2012
| 950
| {{N/a}}
| {{N/a}}
| 950
| 1250<br />(5000)
| 15.2
| 30.4
| 1024<br />2048
| 80
| rowspan="14" | GDDR5
| 729.6
| 30.40
| 75
| OEM
|-
| May 29, 2013
| GK208-400-A1
| TSMC 28 nm
| 1020
| 79
| PCIe 2.0 x8
| 384:16:8
| 1046
| {{N/a}}
| {{N/a}}
| 1046
| 1252<br />(5008)
| 8.37
| 16.7
| rowspan="3" | 1024
| 40.1
| 64
| 803.3
| {{unk}}
| 49
|
|-
! style="text-align:left;" | GeForce GT 645{{efn|name=geforce 600 9|The GeForce GT 645 (OEM) card is a rebranded GeForce GTX 560 SE.}}
| April 24, 2012
| GF114-400-A1
| TSMC 40 nm
| 1950
| 332
| PCIe 2.0 x16
| 6
| 288:48:24
| 776
| {{N/a}}
| {{N/a}}
| 1552
| 1914
| 18.6
| 37.3
| 91.9
| 192
| {{N/a}}
| 894
| {{unk}}
| 140
| rowspan="2" | OEM
|-
! style="text-align:left;" | GeForce GTX 645
| April 22, 2013
| GK106
| rowspan="11" | TSMC 28 nm
| 2540
| 221
| rowspan="11" | PCIe 3.0 x16
| 3
| 576:48:16
| 823.5
| 888.5
| {{N/a}}
| 823
| 1000<br />(4000)
| 14.16
| 39.5
| 64
| rowspan="4" | 128
| rowspan="11" | 1.2
| 948.1
| 39.53
| rowspan="2" | 64
|-
! rowspan="2" style="text-align:left;" | GeForce GTX 650
| September 13, 2012
| GK107-450-A2
| 1300
| 118
| rowspan="2" | 2
| rowspan="2" | 384:32:16
| rowspan="2" | 1058
| {{N/a}}
| {{N/a}}
| rowspan="2" | 1058
| rowspan="2" | 1250<br />(5000)
| rowspan="2" | 16.9
| rowspan="2" | 33.8
| rowspan="4" | 1024<br />2048
| rowspan="2" | 80
| rowspan="2" | 812.54
| rowspan="2" | 33.86
| $110
|-
| November 27, 2013<ref>{{cite web|title=NVIDIA GeForce GTX 650 Specs|url=https://www.techpowerup.com/gpu-specs/geforce-gtx-650.c2445|access-date=2021-12-09|website=TechPowerUp|language=en}}</ref>
| GK-106-400-A1
| rowspan="4" | 2540
| rowspan="4" | 221
| {{N/a}}
| {{N/a}}
| 65
| {{unk}}
|-
! style="text-align:left;" | GeForce GTX 650 Ti
| October 9, 2012
| GK106-220-A1
| rowspan="2" | 4
| 768:64:16
| 928
| {{N/a}}
| {{N/a}}
| 928
| 1350<br />(5400)
| 14.8
| 59.4
| 86.4
| 1425.41
| 59.39
| 110
| $150 (130)
|-
! style="text-align:left;" | GeForce GTX 650 Ti Boost
| March 26, 2013
| GK106-240-A1
| 768:64:24
| rowspan="2" | 980
| rowspan="2" | 1032
| {{N/a}}
| rowspan="2" | 980
| 1502<br />(6008)
| 23.5
| 62.7
| 144.2
| 192
| 1505.28
| 62.72
| 134
| $170 (150)
|-
! rowspan="2" style="text-align:left;" | GeForce GTX 660
| September 13, 2012
| GK106-400-A1
| 5
| 960:80:24
| 1084
| 1502<br />(6008)
| 23.5
| 78.4
| 1536+512<br />3072
| 96.1+48.1<br />144.2
| 128+64<br />192
| 1881.6
| 78.40
| 140
| $230 (180)
|-
| August 22, 2012
| GK104-200-KD-A2
| rowspan="4" | 3540
| rowspan="4" | 294
| 6
| 1152:96:24<br />1152:96:32
| 823.5
| 888.5
| 899
| 823
| 1450<br />(5800)
| 19.8
| 79
| 1536<br />2048<br />3072
| 134
| 192<br />256
| 2108.6
| 79.06
| 130
| OEM
|-
! style="text-align:left;" | GeForce GTX 660 Ti
| August 16, 2012
| GK104-300-KD-A2
| rowspan="2" | 7
| 1344:112:24
| rowspan="2" | 915
| rowspan="2" | 980
| 1058
| rowspan="2" | 915
| 1502<br />(6008)
| 22.0
| 102.5
| 2048
| 96.1+48.1<br />144.2
| 128+64<br />192
| 2459.52
| 102.48
| 150
| $300
|-
! style="text-align:left;" | GeForce GTX 670
| May 10, 2012
| GK104-325-A2
| 1344:112:32
| 1084
| 1502<br />(6008)
| 29.3
| 102.5
| rowspan="2" | 2048<br />4096
| 192.256
| rowspan="2" | 256
| 2459.52
| 102.48
| 170
| $400
|-
! style="text-align:left;" | GeForce GTX 680
| March 22, 2012
| GK104-400-A2
| 8
| 1536:128:32
| 1006<ref name="gtx680-nvidia-paper">{{cite web |url= http://www.geforce.com/Active/en_US/en_US/pdf/GeForce-GTX-680-Whitepaper-FINAL.pdf |title= Nvidia GeForce GTX 680 Whitepaper.pdf |url-status= dead |archive-url= https://web.archive.org/web/20120417045615/http://www.geforce.com/Active/en_US/en_US/pdf/GeForce-GTX-680-Whitepaper-FINAL.pdf |archive-date= April 17, 2012 |df= mdy-all }} {{small|( 1405KB)}}, page 6 of 29</ref>
| 1058
| 1110
| 1006
| 1502<br />(6008)
| 32.2
| 128.8
| 192.256
| 3090.43
| 128.77
| 195
| $500
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 690 |
|||
| April 29, 2012 |
|||
| 2x GK104-355-A2 |
|||
| 2x 3540 |
|||
| 2x 294 |
|||
| 2x 8 |
|||
| 2x 1536:128:32 |
|||
| 915 |
|||
| 1019 |
|||
| 1058 |
|||
| 915 |
|||
| 1502<br />(6008) |
|||
| 2x 29.28 |
|||
| 2x 117.12 |
|||
| 2x 2048 |
|||
| 2x 192.256 |
|||
| 2x 256 |
|||
| 2x 2810.88 |
|||
| 2x 117.12 |
|||
| 300 |
|||
| $1000 |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Fab ([[Nanometer|nm]]) |
|||
! rowspan="2" | Transistors (million) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | SM count |
|||
! rowspan="2" | Core config{{efn|name=geforce 600 1|[[Unified shader model|Unified shaders]]: [[texture mapping unit]]s: [[render output unit]]s}} |
|||
! colspan="5" | Clock rate |
|||
! colspan="2" | [[Fillrate]] |
|||
! colspan="4" | Memory configuration |
|||
! colspan="4" | Supported [[Application programming interface|API]] version |
|||
! colspan="2" | Processing power ([[GFLOPS]]){{efn|name=geforce 600 10|To calculate the processing power see [[Kepler (microarchitecture)#Performance]], or [[Fermi (microarchitecture)#Performance]].}} |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | Release Price (USD) |
|||
|- |
|||
! Core ([[Hertz|MHz]]) |
|||
! Average Boost ([[Hertz|MHz]]) |
|||
! Max Boost ([[Hertz|MHz]]) |
|||
! Shader ([[Hertz|MHz]]) |
|||
! Memory ([[Hertz|MHz]]) |
|||
! Pixel ([[Pixel|GP]]/s) |
|||
! Texture ([[Texel (graphics)|GT]]/s) |
|||
! Size ([[Megabyte|MB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! DRAM type |
|||
! Bus width ([[bit]]) |
|||
! [[Vulkan (API)|Vulkan]] |
|||
! [[Direct3D]] |
|||
! [[OpenGL]] |
|||
! [[OpenCL]] |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
|} |
|||
{{notelist}} |
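The processing-power note above reduces to simple arithmetic: Kepler single-precision GFLOPS are shader count × 2 operations (one FMA per clock) × clock speed. A minimal sketch (the helper function names are illustrative, not from any Nvidia tool), checked against the GTX 680 row:

```python
# Kepler single-precision GFLOPS: shaders x 2 (one FMA per clock) x core MHz.
def kepler_sp_gflops(shaders: int, core_mhz: float) -> float:
    return shaders * 2 * core_mhz / 1000.0

# Memory bandwidth: effective transfer rate (MT/s) x bus width in bytes.
def memory_bandwidth_gbs(effective_mts: float, bus_width_bits: int) -> float:
    return effective_mts * bus_width_bits / 8 / 1000.0

# GeForce GTX 680: 1536 shaders at 1006 MHz, 6008 MT/s effective, 256-bit bus.
sp = kepler_sp_gflops(1536, 1006)     # 3090.432, table lists 3090.43
dp = sp / 24                          # GK104 double precision is 1/24 of SP: 128.768
bw = memory_bandwidth_gbs(6008, 256)  # 192.256, as listed
```

The same products reproduce the other GK10x rows; GeForce GK110 cards also run double precision at 1/24 rate, except the TITAN models, which can enable a 1/3 rate.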
|||
=== GeForce 700 series === |
|||
{{Further|GeForce 700 series|Kepler (microarchitecture)}} |
|||
The GeForce 700 series for desktop. The GM107 chips are based on [[Maxwell (microarchitecture)|Maxwell]]; the GKxxx chips are based on [[Kepler (microarchitecture)|Kepler]].
|||
* Improved [[Nvidia NVENC|NVENC]]
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Fab ([[Nanometer|nm]]) |
|||
! rowspan="2" | Transistors (million) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | SMX count |
|||
! rowspan="2" | Core config{{efn|name=geforce 700 1|[[Unified shader model|Unified shaders]]: [[texture mapping unit]]s: [[render output unit]]s}} |
|||
! colspan="4" | Clock rate |
|||
! colspan="2" | [[Fillrate]] |
|||
! colspan="4" | Memory configuration |
|||
! colspan="4" | Supported [[Application programming interface|API]] version |
|||
! colspan="2" | Processing power ([[GFLOPS]]){{efn|name=geforce 700 9|To calculate the processing power see [[Maxwell (microarchitecture)#Performance]], or [[Kepler (microarchitecture)#Performance]].}} |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | Release Price (USD) |
|||
|- |
|||
! Base ([[Hertz|MHz]]) |
|||
! Average Boost ([[Hertz|MHz]]) |
|||
! Max Boost{{efn|name=geforce 700 2|Max Boost depends on ASIC quality. For example, some GTX TITAN with over 80% ASIC quality can hit 1019 MHz by default, lower ASIC quality will be 1006 MHz or 993 MHz.}} ([[Hertz|MHz]]) |
|||
! Memory ([[Hertz|MHz]]) |
|||
! Pixel ([[Pixel|GP]]/s) |
|||
! Texture ([[Texel (graphics)|GT]]/s) |
|||
! Size ([[Megabyte|MB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! DRAM type |
|||
! Bus width ([[bit]]) |
|||
! [[Vulkan (API)|Vulkan]]{{efn|name=geforce 700 11|Maxwell supports Vulkan version 1.3, while Kepler only support Vulkan version 1.2, Fermi does not support the Vulkan API at all.<ref name="vulkandrv" />}} |
|||
! [[Direct3D]]{{efn|name=geforce 700 3|Kepler supports some optional 11.1 features on [[Direct3D feature level|feature level]] 11_0 through the Direct3D 11.1 API, however Nvidia did not enable four non-gaming features to qualify Kepler for level 11_1.<ref>{{cite web |url=http://www.guru3d.com/news_story/nvidia_kepler_not_fully_compliant_with_directx_11_1.html |title=Nvidia Kepler not fully compliant with Direct3D 11.1 |website=Guru3d.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151222113454/http://www.guru3d.com/news_story/nvidia_kepler_not_fully_compliant_with_directx_11_1.html |archive-date=2015-12-22 |url-status=live }}</ref><ref>[http://www.brightsideofnews.com/news/2012/11/21/nvidia-doesnt-fully-support-Direct3D-111-with-kepler-gpus2c-bute280a6.aspx Nvidia Doesn't Fully Support DirectX 11.1 with Kepler GPUs, But... - BrightSideOfNews.com] {{webarchive |url=https://web.archive.org/web/20130903174514/http://www.brightsideofnews.com/news/2012/11/21/nvidia-doesnt-fully-support-directx-111-with-kepler-gpus2c-bute280a6.aspx |date=September 3, 2013 }}</ref>}} |
|||
! [[OpenGL]] |
|||
! [[OpenCL]] |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
|- |
|||
! style="text-align:left;" | GeForce GT 705<ref>{{cite web |url=http://www.techpowerup.com/gpudb/2578/geforce-gt-705.html |title=Nvidia GeForce GT 705 {{pipe}} techPowerUp GPU Database |website=Techpowerup.com |access-date=2015-12-11 |archive-url=https://archive.today/20140615091732/http://www.techpowerup.com/gpudb/2578/geforce-gt-705.html |archive-date=2014-06-15 |url-status=live }}</ref>{{efn|name=geforce 700 4|The GeForce GT 705 (OEM) is a rebranded GeForce GT 610, which itself is a rebranded GeForce GT 520.}} |
|||
| rowspan="2" | March 27, 2014 |
|||
| GF119-300-A1 |
|||
| [[TSMC]] 40 nm |
|||
| 292 |
|||
| 79 |
|||
| PCIe 2.0 x16 |
|||
| rowspan="4" | 1 |
|||
| 48:8:4 |
|||
| 810 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 898<br />(1796) |
|||
| 3.24 |
|||
| 6.5 |
|||
| 512<br />1024 |
|||
| rowspan="2" | 14.4 |
|||
| rowspan="2" | DDR3 |
|||
| 64 |
|||
| n/a |
|||
| rowspan="21" | 12 |
|||
| rowspan="20" | 4.6 |
|||
| 1.1 |
|||
| 155.5 |
|||
| 19.4 |
|||
| 29 |
|||
| rowspan="2" | OEM |
|||
|- |
|||
! rowspan="2" style="text-align:left;" | GeForce GT 710<ref>{{cite web |url=http://www.techpowerup.com/gpudb/1990/geforce-gt-710.html |title=Nvidia GeForce GT 710 {{pipe}} techPowerUp GPU Database |website=Techpowerup.com |access-date=2015-12-11 |archive-url=https://archive.today/20140615091722/http://www.techpowerup.com/gpudb/1990/geforce-gt-710.html |archive-date=2014-06-15 |url-status=live }}</ref> |
|||
| GK208-301-A1 |
|||
| rowspan="5" | TSMC [[28 nm]] |
|||
| rowspan="5" | 1020 |
|||
| rowspan="5" | 79 |
|||
| PCIe 2.0 x8 |
|||
| 192:16:8 |
|||
| 823 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 900 (1800) |
|||
| 6.6 |
|||
| 13.2 |
|||
| 512 |
|||
| rowspan="5" | 64 |
|||
| rowspan="5" | 1.2 |
|||
| rowspan="5" | 1.2 |
|||
| 316.0 |
|||
| 13.2 |
|||
| |
|||
|- |
|||
| January 26, 2016 |
|||
| GK208-203-B1 |
|||
| PCIe 2.0 x8, PCIe x1 |
|||
| 192:16:8 |
|||
| 954 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 900 (1800)<br />1253 (5010) |
|||
| 7.6 |
|||
| 15.3 |
|||
| 1024<br />2048 |
|||
| 14.4<br />40.0 |
|||
| rowspan="2" | DDR3<br />GDDR5 |
|||
| 366 |
|||
| 15.3 |
|||
| rowspan="2" | 19 |
|||
| $35–45 |
|||
|- |
|||
! style="text-align:left;" | GeForce GT 720<ref>{{cite web |url=http://www.techpowerup.com/gpudb/1989/geforce-gt-720.html |title=Nvidia GeForce GT 720 {{pipe}} techPowerUp GPU Database |website=Techpowerup.com |access-date=2015-12-11 |archive-url=https://archive.today/20140615091724/http://www.techpowerup.com/gpudb/1989/geforce-gt-720.html |archive-date=2014-06-15 |url-status=live }}</ref> |
|||
| March 27, 2014 |
|||
| GK208-201-B1 |
|||
| rowspan="3" | PCIe 2.0 x8 |
|||
| 192:16:8 |
|||
| 797 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 900 (1800)<br />1253 (5010) |
|||
| 6.4 |
|||
| 12.8 |
|||
| 1024<br />2048 |
|||
| 14.4<br />40.0 |
|||
| 306 |
|||
| 12.8 |
|||
| $49–59 |
|||
|- |
|||
! rowspan="3" style="text-align:left;" | GeForce GT 730<br /><ref name="gt730">{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gt-730/specifications |title=GT 730 {{pipe}} Specifications |website=GeForce.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151212232644/http://www.geforce.com/hardware/desktop-gpus/geforce-gt-730/specifications |archive-date=2015-12-12 |url-status=live }}</ref>{{efn|name=geforce 700 5|The GeForce GT 730 (DDR3, 64-bit) is a rebranded GeForce GT 630 (Rev. 2).}}{{efn|name=geforce 700 6|The GeForce GT 730 (DDR3, 128-bit) is a rebranded GeForce GT 630 (128-bit).}} |
|||
| rowspan="3" | June 18, 2014 |
|||
| GK208-301-A1 |
|||
| rowspan="5" | 2 |
|||
| 384:16:8 |
|||
| 902 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 900<br />(1800) |
|||
| 7.22 |
|||
| 14.44 |
|||
| 1024<ref name="1GBgt730">{{cite web|url=https://www.zotac.com/product/graphics_card/gt-730-1gb|title=GeForce GT 730 1GB-ZOTAC|work=[[ZOTAC]]|access-date=July 12, 2017|archive-url=https://web.archive.org/web/20160719212714/https://www.zotac.com/product/graphics_card/gt-730-1gb|archive-date=July 19, 2016|url-status=live}}</ref><br />2048<br />4096 |
|||
| 14.4 |
|||
| DDR3 |
|||
| rowspan="2" | 692.7 |
|||
| rowspan="2" | 28.9 |
|||
| 23 |
|||
| rowspan="3" | $69–79 |
|||
|- |
|||
| GK208-400-A1 |
|||
| 384:16:8 |
|||
| 902 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 1250<br />(5000) |
|||
| 7.22 |
|||
| 14.44 |
|||
| 1024<br />2048<ref>{{cite web |url=http://www.evga.com/products/product.aspx?pn=02G-P3-3733-KR |title=EVGA - Products - EVGA GeForce GT 730 2GB (Low Profile) - 02G-P3-3733-KR |access-date=2017-04-29 |archive-url=https://web.archive.org/web/20170215122047/http://www.evga.com/Products/Product.aspx?pn=02G-P3-3733-KR |archive-date=2017-02-15 |url-status=live }}</ref> |
|||
| 40.0 |
|||
| GDDR5 |
|||
| 25 |
|||
|- |
|||
| GF108 |
|||
| TSMC 40 nm |
|||
| 585 |
|||
| 116 |
|||
| PCIe 2.0 x16 |
|||
| 96:16:4 |
|||
| 700 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 900<br />(1800) |
|||
| 2.8 |
|||
| 11.0 |
|||
| rowspan="3" | 1024<br />2048<br />4096 |
|||
| 28.8 |
|||
| rowspan="2" | DDR3 |
|||
| 128 |
|||
| n/a |
|||
| 1.1 |
|||
| 268.8 |
|||
| 33.6 |
|||
| 49 |
|||
|- |
|||
! rowspan="2" style="text-align:left;" | GeForce GT 740{{efn|name=geforce 700 7|The GeForce GT 740 (OEM) is a rebranded GeForce GTX 650.}} |
|||
| rowspan="2" | May 29, 2014 |
|||
| rowspan="2" | GK107-425-A2 |
|||
| rowspan="14" | [[TSMC]]<br />[[28 nanometer|28HP]] |
|||
| rowspan="2" | 1270 |
|||
| rowspan="2" | 118 |
|||
| rowspan="14" | PCIe 3.0 x16 |
|||
| 384:32:16 |
|||
| 993 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 891<br />(1782) |
|||
| 15.9 |
|||
| 31.8 |
|||
| 28.5 |
|||
| rowspan="5" | 128 |
|||
| rowspan="2" | 1.2 |
|||
| rowspan="14" | 1.2 |
|||
| rowspan="2" | 762.6 |
|||
| rowspan="2" | 31.8 |
|||
| rowspan="2" | 64 |
|||
| rowspan="2" | $89–99 |
|||
|- |
|||
| 384:32:16 |
|||
| 993 |
|||
| {{n/a}} |
|||
| {{n/a}} |
|||
| 1252<br />(5008) |
|||
| 15.9 |
|||
| 31.8 |
|||
| 80.1 |
|||
| GDDR5 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 745 |
|||
| rowspan="3" | February 18, 2014 |
|||
| GM107-220-A2 |
|||
| rowspan="3" | 1870 |
|||
| rowspan="3" | 148 |
|||
| 3 |
|||
| 384:24:16 |
|||
| 1033 |
|||
| {{unk}} |
|||
| {{unk}} |
|||
| 900<br />(1800) |
|||
| 16.5 |
|||
| 24.8 |
|||
| 1024<br />4096 |
|||
| 28.8 |
|||
| DDR3 |
|||
| rowspan="3" | 1.3 |
|||
| 793.3 |
|||
| 24.8 |
|||
| rowspan="2" | 55 |
|||
| OEM |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 750 |
|||
| GM107-300-A2 |
|||
| 4 |
|||
| 512:32:16 |
|||
| 1020 |
|||
| 1085 |
|||
| 1163 |
|||
| 1250<br />(5000) |
|||
| 16.3 |
|||
| 32.6 |
|||
| 1024<br />2048<br />4096<ref>{{cite web|title=AFOX GTX 750 4 GB Specs|url=https://www.techpowerup.com/gpu-specs/afox-gtx-750-4-gb.b9074|access-date=2021-08-11|website=TechPowerUp|language=en}}</ref> |
|||
| 80 |
|||
| rowspan="11" | GDDR5 |
|||
| 1044.5 |
|||
| 32.6 |
|||
| $119 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 750 Ti |
|||
| GM107-400-A2 |
|||
| 5 |
|||
| 640:40:16 |
|||
| 1020 |
|||
| 1085 |
|||
| 1200 |
|||
| 1350<br />(5400) |
|||
| 16.3 |
|||
| 40.8 |
|||
| 1024<br />2048<br />4096 |
|||
| 86.4 |
|||
| 1305.6 |
|||
| 40.8 |
|||
| 60 |
|||
| $149 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 760 192-bit |
|||
| October 17, 2013 |
|||
| GK104-200-KD-A2 |
|||
| rowspan="4" | 3540 |
|||
| rowspan="4" | 294 |
|||
| rowspan="2" | 6 |
|||
| 1152:96:24 |
|||
| 824 |
|||
| 888 |
|||
| 889 |
|||
| 1450<br />(5800) |
|||
| 19.8 |
|||
| 79.1 |
|||
| 1536<br />3072 |
|||
| 134.4 |
|||
| 192 |
|||
| rowspan="9" | 1.2 |
|||
| 1896.2 |
|||
| 79.0 |
|||
| 130 |
|||
| OEM |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 760 |
|||
| June 25, 2013 |
|||
| GK104-225-A2 |
|||
| 1152:96:32 |
|||
| 980 |
|||
| 1033 |
|||
| 1124 |
|||
| 1502<br />(6008) |
|||
| 31.4{{efn|name=geforce 700 10|As a Kepler GPC is able to rasterize 8 pixels per clock, fully enabled GK110 GPUs (780 Ti/TITAN Black) can only output 40 pixels per clock (5 GPCs), despite 48 ROPs and all SMX units being physically present. For GTX 780 and GTX 760, multiple GPC configurations with differing pixel fillrate are possible, depending on which SMXs were disabled in the chip: 5/4 GPCs, or 4/3 GPCs, respectively.}} |
|||
| 94 |
|||
| 2048<br />4096 |
|||
| 192.3 |
|||
| rowspan="3" | 256 |
|||
| 2257.9 |
|||
| 94.1 |
|||
| rowspan="2" | 170 |
|||
| $249 ($219) |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 760 Ti{{efn|name=geforce 700 8|The GeForce GTX 760 Ti (OEM) is a rebranded GeForce GTX 670.}} |
|||
| September 27, 2013<ref>{{cite web |url=https://www.techpowerup.com/gpudb/2491/geforce-gtx-760-ti-oem |title=NVIDIA GeForce GTX 760 Ti OEM Specs |access-date=2018-08-06 |archive-url=https://archive.today/20160530101755/http://www.techpowerup.com/gpudb/2491/geforce-gtx-760-ti-oem |archive-date=2016-05-30 |url-status=live }}</ref> |
|||
| GK104 |
|||
| 7 |
|||
| 1344:112:32 |
|||
| 915 |
|||
| 980 |
|||
| 1084 |
|||
| 1502<br />(6008) |
|||
| 29.3 |
|||
| 102.5 |
|||
| 2048 |
|||
| 192.3 |
|||
| 2459.5 |
|||
| 102.5 |
|||
| OEM |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 770 |
|||
| May 30, 2013 |
|||
| GK104-425-A2 |
|||
| 8 |
|||
| 1536:128:32 |
|||
| 1046 |
|||
| 1085 |
|||
| 1130 |
|||
| 1752.5<br />(7010) |
|||
| 33.5 |
|||
| 134 |
|||
| 2048<br />4096
|||
| 224 |
|||
| 3213.3 |
|||
| 133.9 |
|||
| rowspan="5" | 230 |
|||
| $399 ($329) |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 780 |
|||
| May 23, 2013 |
|||
| GK110-300-A1 |
|||
| rowspan="4" | 7080 |
|||
| rowspan="4" | 561 |
|||
| 12 |
|||
| 2304:192:48 |
|||
| 863 |
|||
| 900 |
|||
| 1002 |
|||
| 1502<br />(6008) |
|||
| 41.4{{efn|name=geforce 700 10|As a Kepler GPC is able to rasterize 8 pixels per clock, fully enabled GK110 GPUs (780 Ti/TITAN Black) can only output 40 pixels per clock (5 GPCs), despite 48 ROPs and all SMX units being physically present. For GTX 780 and GTX 760, multiple GPC configurations with differing pixel fillrate are possible, depending on which SMXs were disabled in the chip: 5/4 GPCs, or 4/3 GPCs, respectively.}} |
|||
| 160.5 |
|||
| 3072<br />6144<ref>{{cite web |url=http://www.evga.com/articles/00830/ |title=Articles - EVGA GeForce GTX 780 6GB Step-Up Available Now! |publisher=EVGA |date=2014-03-21 |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20160113203734/http://www.evga.com/articles/00830/ |archive-date=2016-01-13 |url-status=live }}</ref>
|||
| 288.4 |
|||
| rowspan="4" | 384 |
|||
| 3976.7 |
|||
| 165.7 |
|||
| $649 ($499) |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 780 Ti<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-780-ti/specifications |title=GeForce GTX780 Ti. Specifications |website=Geforce.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151212021141/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-780-ti/specifications |archive-date=2015-12-12 |url-status=live }}</ref><ref>{{cite web |url=http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores |title=Nvidia GeForce GTX 780 Ti has 2880 CUDA cores |website=Videocardz.com |date=31 October 2013 |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151222120538/http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores |archive-date=2015-12-22 |url-status=live }}</ref><ref>{{cite web |url=http://web-engage.augure.com/pub/link/282593/04601926874847631383752919307-hl-com.com.html |title=PNY dévoile son nouveau foudre de guerre: la GeForce GTX 780 TI. |website=Web-engage.augure.com |access-date=2015-12-11 |url-status=dead |archive-url=https://web.archive.org/web/20131109211440/http://web-engage.augure.com/pub/link/282593/04601926874847631383752919307-hl-com.com.html |archive-date=November 9, 2013 }}</ref> |
|||
| November 7, 2013 |
|||
| GK110-425-B1 |
|||
| 15 |
|||
| 2880:240:48 |
|||
| 876 |
|||
| 928 |
|||
| 1019 |
|||
| 1752.5<br />(7010) |
|||
| 42.0{{efn|name=geforce 700 10|As a Kepler GPC is able to rasterize 8 pixels per clock, fully enabled GK110 GPUs (780 Ti/TITAN Black) can only output 40 pixels per clock (5 GPCs), despite 48 ROPs and all SMX units being physically present. For GTX 780 and GTX 760, multiple GPC configurations with differing pixel fillrate are possible, depending on which SMXs were disabled in the chip: 5/4 GPCs, or 4/3 GPCs, respectively.}} |
|||
| 210.2 |
|||
| 3072 |
|||
| 336.5 |
|||
| 5045.7 |
|||
| 210.2 |
|||
| $699 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX TITAN<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan |title=GeForce GTX TITAN |website=Geforce.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151205173714/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan |archive-date=2015-12-05 |url-status=live }}</ref><ref>{{cite web |url=http://www.nvidia.com/titan-graphics-card |title=TITAN Graphics Card |website=Nvidia.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20130224082627/http://www.nvidia.com/titan-graphics-card |archive-date=2013-02-24 |url-status=live }}</ref><ref>{{cite web |url=http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1 |title=Nvidia's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute |website=Anandtech.com |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151204225432/http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1 |archive-date=2015-12-04 |url-status=live}}</ref> |
|||
| February 21, 2013 |
|||
| GK110-400-A1 |
|||
| 14 |
|||
| 2688:224:48 |
|||
| 837 |
|||
| 876 |
|||
| 993 |
|||
| 1502<br />(6008) |
|||
| 40.2 |
|||
| 187.5 |
|||
| rowspan="2" | 6144 |
|||
| 288.4 |
|||
| 4499.7 |
|||
| 1300<ref>{{cite web |url=http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/3 |title=Titan's Compute Performance (aka Ph.D Lust) - Nvidia's GeForce GTX Titan Review, Part 2: Titan's Performance Unveiled |website=Anandtech.com |access-date=2015-12-11 |quote=the calculated fp64 peak of Titan is 1.5 TFlops. However, under heavy load in fp64 mode, the card may underclock below the listed 837MHz to remain within the power and thermal specifications |archive-url=https://web.archive.org/web/20151222141607/http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/3 |archive-date=2015-12-22 |url-status=live }}</ref>-1499.9 |
|||
| rowspan="2" | $999 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX TITAN Black |
|||
| February 18, 2014 |
|||
| GK110-430-B1 |
|||
| 15 |
|||
| 2880:240:48 |
|||
| 889 |
|||
| 980 |
|||
| 1058 |
|||
| 1752.5<br />(7010) |
|||
| 42.7 |
|||
| 213.4 |
|||
| 336.5 |
|||
| 5120.6 |
|||
| 1706.9 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX TITAN Z |
|||
| May 28, 2014 |
|||
| 2x GK110-350-B1<ref>{{cite web|url=https://diy.pconline.com.cn/488/4883359_all.html|title=售价21999元!NV旗舰GTX TITAN Z评测-太平洋电脑网|trans-title=Priced at 21,999 yuan! Nvidia flagship GTX TITAN Z review|language=zh|date=5 June 2014|access-date=2020-08-16}}</ref>
|||
| 2x 7080 |
|||
| 2x 561 |
|||
| 2x 15 |
|||
| 2x 2880:240:48 |
|||
| 705 |
|||
| 876 |
|||
| {{unk}} |
|||
| 1752.5<br />(7010) |
|||
| 2x 33.8 |
|||
| 2x 169 |
|||
| 2x 6144 |
|||
| 2x 336.5 |
|||
| 2x 384 |
|||
| 4.5 |
|||
| 2x 5046
|||
| 2x 1682<ref>{{cite web|url=https://www.techpowerup.com/gpu-specs/geforce-gtx-titan-z.c2575|title=NVIDIA GeForce GTX TITAN Z Specs|website=TechPowerUp|language=en|access-date=2020-02-26}}</ref>
|||
| 375 |
|||
| $2999 |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Fab ([[Nanometer|nm]]) |
|||
! rowspan="2" | Transistors (million) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | SMX count |
|||
! rowspan="2" | Core config{{efn|name=geforce 700 1|[[Unified shader model|Unified shaders]]: [[texture mapping unit]]s: [[render output unit]]s}} |
|||
! colspan="4" | Clock rate |
|||
! colspan="2" | [[Fillrate]] |
|||
! colspan="4" | Memory configuration |
|||
! colspan="4" | Supported [[Application programming interface|API]] version |
|||
! colspan="2" | Processing power ([[GFLOPS]]){{efn|name=geforce 700 9}}
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | Release Price (USD) |
|||
|- |
|||
! Base ([[Hertz|MHz]]) |
|||
! Average Boost ([[Hertz|MHz]]) |
|||
! Max Boost{{efn|name=geforce 700 2|Max Boost depends on ASIC quality. For example, some GTX TITAN with over 80% ASIC quality can hit 1019 MHz by default, lower ASIC quality will be 1006 MHz or 993 MHz.}} ([[Hertz|MHz]]) |
|||
! Memory ([[Hertz|MHz]]) |
|||
! Pixel ([[Pixel|GP]]/s) |
|||
! Texture ([[Texel (graphics)|GT]]/s) |
|||
! Size ([[Megabyte|MB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! DRAM type |
|||
! Bus width ([[bit]]) |
|||
! [[Vulkan (API)|Vulkan]] |
|||
! [[Direct3D]]{{efn|name=geforce 700 3}} |
|||
! [[OpenGL]] |
|||
! [[OpenCL]] |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
|} |
|||
{{notelist}} |
|||
===GeForce 900 series=== |
|||
{{Further|GeForce 900 series|Maxwell (microarchitecture)}} |
|||
* All models support the following [[Application programming interface|API]]s: [[Direct3D]] 12 ([[Direct3D feature level|feature level]] 12_1), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv">{{cite web | url=https://www.khronos.org/conformance/adopters/conformant-products | title=The Khronos Group | date=31 May 2022 }}</ref> and [[CUDA]] compute capability 5.2
|||
* Improved [[Nvidia NVENC|NVENC]] (YUV 4:4:4, predictive lossless encoding).
|||
* Added [[High Efficiency Video Coding|H.265]] (HEVC) hardware encoding support on GM20x.
|||
* GM108 lacks the [[Nvidia NVENC|NVENC]] hardware encoder.
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" |[[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! colspan="3" | Clock Speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="2" | Processing power ([[GFLOPS]]){{efn|name=PerfValues}}{{efn|name=ProcessingPower}} |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[Scalable Link Interface|SLI]] support
|||
! Release price (USD) |
|||
|- |
|||
! Base ([[Hertz|MHz]]) |
|||
! Boost ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
! MSRP |
|||
|- |
|||
! style="text-align:left;" | GeForce GT 945A<ref>{{cite web |url=https://support.hp.com/at-de/document/c04996577 |title=Sprout Pro by HP |publisher=[[Hewlett-Packard|HP]] |access-date=2019-01-09 |archive-url=https://web.archive.org/web/20190109111211/https://support.hp.com/at-de/document/c04996577 |archive-date=2019-01-09 |url-status=live }}</ref><ref>{{cite web |url=https://devtalk.nvidia.com/default/topic/915766/linux-solaris-and-freebsd-driver-361-28-long-lived-branch-release-/ |title=Linux, Solaris, and FreeBSD driver 361.28 (long-lived branch release) |publisher=Nvidia |date=2016-02-09 |access-date=2016-02-10 |archive-url=https://web.archive.org/web/20160216022209/https://devtalk.nvidia.com/default/topic/915766/linux-solaris-and-freebsd-driver-361-28-long-lived-branch-release-/ |archive-date=2016-02-16 |url-status=live }}</ref><ref>{{cite web |url=https://www.techpowerup.com/gpudb/2813/geforce-945a |title=NVIDIA GeForce 945A Specs |access-date=2018-08-06 }}{{dead link|date=June 2022|bot=medic}}{{cbignore|bot=medic}}</ref> |
|||
| February 2016
|||
| GM108 |
|||
| rowspan="9" | [[TSMC]]<br />[[28 nanometer|28HP]]
|||
| {{unk}} |
|||
| {{unk}} |
|||
| 512:24:8 (4) |
|||
| PCIe 3.0 x8 |
|||
| {{unk}}
|||
| 1072 |
|||
| 1176 |
|||
| 1800 |
|||
| 1 / 2 |
|||
| 14.4 |
|||
| [[DDR3]] / [[GDDR5]] |
|||
| 64 |
|||
| 8.5<br />9.4 |
|||
| 25.7<br />28.2 |
|||
| 1,097.7<br />1,204.2 |
|||
| 34.3<br />37.6 |
|||
| 33 |
|||
| {{No}}
|||
| {{okay|OEM}} |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 950<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950/specifications |title=GTX 950 {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151212232816/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950/specifications |archive-date=2015-12-12 |url-status=live }}</ref> |
|||
| August 20, 2015 |
|||
| GM206-250 |
|||
| rowspan="3" | 2.94 |
|||
| rowspan="3" | 227 |
|||
| 768:48:32 (6) |
|||
| rowspan="8" | PCIe 3.0 x16 |
|||
| rowspan="4" | 1 |
|||
| 1024 |
|||
| 1188 |
|||
| 6600 |
|||
| rowspan="2" | 2 |
|||
| 105.7 |
|||
| rowspan="8" | [[GDDR5]] |
|||
| rowspan="3" | 128 |
|||
| 32.7<br />38.0 |
|||
| 49.1<br />57.0 |
|||
| 1,572.8<br />1,824.7 |
|||
| 49.1<br />57.0 |
|||
| 90 (75{{efn|name=GTX950NPC|Some GTX 950 cards were released without a power connector, powered only by the PCIe slot; these were limited to a 75 W power draw and TDP.<ref>{{cite web |title=GIGABYTE Adds 75W GeForce GTX 950 to Lineup |url=https://www.anandtech.com/show/10250/gigabyte-adds-geforce-gtx-950-with-75w-power-consumption-to-lineup }}</ref> }})
|||
| rowspan="4" | 2-way [[Scalable Link Interface|SLI]]
|||
| $159 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 950 (OEM)<ref>{{cite web|url=https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950-oem/specifications|title=GeForce GTX 950 (OEM) {{pipe}} Specifications {{pipe}} GeForce|website=geforce.com|access-date=2019-01-09|archive-url=https://web.archive.org/web/20180923200607/https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-950-oem/specifications|archive-date=2018-09-23|url-status=live}}</ref> |
|||
| {{unk}} |
|||
| GM206 |
|||
| rowspan="2" | 1024:64:32 (8) |
|||
| 935 |
|||
| {{unk}} |
|||
| 5000 |
|||
| 80.0 |
|||
| 29.9
|||
| 59.8
|||
| 1,914.9
|||
| 59.8
|||
| {{unk}} |
|||
| {{okay|OEM}} |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 960<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications |title=GTX 960 {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151212024030/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960/specifications |archive-date=2015-12-12 |url-status=live }}</ref> |
|||
| January 22, 2015 |
|||
| GM206-300 |
|||
| 1127 |
|||
| 1178 |
|||
| 7000 |
|||
| 2<br />4{{efn|name=GTX9604GB| Some manufacturers produced 4 GB versions of the GTX 960. These were often criticized as a pointless move, since titles that could use that much VRAM and actually benefit over the 2 GB version would already run too slowly at those resolutions and settings, as the GTX 960 did not have enough compute power and memory bandwidth to handle them.<ref>{{cite web |title=Nvidia GeForce GTX 960 2GB vs 4GB review |website=[[Eurogamer]] |date=18 October 2015 |url=https://www.eurogamer.net/articles/digitalfoundry-2015-nvidia-geforce-gtx-960-2gb-vs-4gb-review }}</ref> }}
|||
| 112.1 |
|||
| 36.0<br />37.6 |
|||
| 72.1<br />75.3 |
|||
| 2,308.0<br />2,412.5 |
|||
| 72.1<br />75.3 |
|||
| 120 |
|||
| $199 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 960 (OEM)<ref>{{cite web |url=https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960-oem/specifications |title=GeForce GTX 960 (OEM) {{pipe}} Specifications {{pipe}} GeForce |website=geforce.com |access-date=2019-01-09 |archive-url=https://web.archive.org/web/20180923200618/https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-960-oem/specifications |archive-date=2018-09-23 |url-status=live }}</ref> |
|||
| {{unk}} |
|||
| GM204 |
|||
| rowspan="3" | 5.2 |
|||
| rowspan="3" | 398 |
|||
| 1280:80:48 (10) |
|||
| 924 |
|||
| {{unk}} |
|||
| 5000 |
|||
| 3 |
|||
| 120.0 |
|||
| 192 |
|||
| 44.3
|||
| 73.9
|||
| 2,365.4
|||
| 73.9
|||
| {{unk}} |
|||
| {{okay|OEM}} |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 970<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications |title=GTX 970 {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151207185709/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications |archive-date=2015-12-07 |url-status=live }}</ref> |
|||
| September 18, 2014 |
|||
| GM204-200 |
|||
| 1664:104:56 (13) |
|||
| 1.75 |
|||
| 1050 |
|||
| 1178 |
|||
| rowspan="4" | 7000 |
|||
| 3.5 +<br />0.5{{efn|name=GTX960MemoryMess|For accessing its memory, the GTX 970 stripes data across 7 of its 8 32-bit physical memory lanes, at 196 GB/s. The last 1/8 of its memory (0.5 GB on a 4 GB card) is accessed on a non-interleaved solitary 32-bit connection at 28 GB/s, one seventh the speed of the rest of the memory space. Because this smaller memory pool shares a connection with the 7th lane to the larger main pool, accesses to it contend with accesses to the larger block, reducing the effective memory bandwidth rather than adding to it as an independent connection could.<ref>{{cite news |last1=Wasson |first1=Scott |date=January 26, 2015 |title=Nvidia: the GeForce GTX 970 works exactly as intended, A look inside the card's unusual memory config |url=http://techreport.com/review/27724/nvidia-the-geforce-gtx-970-works-exactly-as-intended |newspaper=[[The Tech Report]] |page=1 |access-date=2015-01-26 |archive-url=https://web.archive.org/web/20150128051624/http://techreport.com/review/27724/nvidia-the-geforce-gtx-970-works-exactly-as-intended |archive-date=January 28, 2015 |url-status=live }}</ref>}}
|||
| 196.3 +<br />28.0{{efn|name=GTX960MemoryMess}} |
|||
| 224 +<br />32{{efn|name=GTX960MemoryMess}} |
|||
| 58.8<br />65.9 |
|||
| 109.2<br />122.5 |
|||
| 3,494.4<br />3,920.3 |
|||
| 109.2<br />122.5 |
|||
| 145 |
|||
| rowspan="4" | 4-way [[Scalable Link Interface|SLI]]
|||
| $329 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 980<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980/specifications |title=GTX 980 {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151208184430/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980/specifications |archive-date=2015-12-08 |url-status=live }}</ref> |
|||
| September 18, 2014 |
|||
| GM204-400 |
|||
| 2048:128:64 (16) |
|||
| 2 |
|||
| 1126 |
|||
| 1216 |
|||
| 4 |
|||
| 224.3 |
|||
| 256 |
|||
| 72.0<br />77.8 |
|||
| 144.1<br />155.6 |
|||
| 4,612.0<br />4,980.7 |
|||
| 144.1<br />155.6 |
|||
| 165 |
|||
| $549 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 980 Ti<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980-ti/specifications |title=GTX 980 Ti {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151211174512/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-980-ti/specifications |archive-date=2015-12-11 |url-status=live }}</ref> |
|||
| June 1, 2015 |
|||
| GM200-310 |
|||
| rowspan="2" | 8 |
|||
| rowspan="2" | 601 |
|||
| 2816:176:96 (22) |
|||
| rowspan="2" | 3 |
|||
| rowspan="2" | 1000 |
|||
| rowspan="2" | 1075 |
|||
| 6 |
|||
| rowspan="2" | 336.5 |
|||
| rowspan="2" | 384 |
|||
| rowspan="2" | 96.0<br />103.2 |
|||
| 176.0<br />189.2 |
|||
| 5,632.0<br />6,054.4 |
|||
| 176.0<br />189.2 |
|||
| rowspan="2" | 250 |
|||
| $649 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX TITAN X<ref>{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x/specifications |title=GTX TITAN X {{pipe}} Specifications |publisher=GeForce |access-date=2015-12-11 |archive-url=https://web.archive.org/web/20151205173930/http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x/specifications |archive-date=2015-12-05 |url-status=live }}</ref> |
|||
| March 17, 2015 |
|||
| GM200-400 |
|||
| 3072:192:96 (24) |
|||
| 12 |
|||
| 192.0<br />206.4 |
|||
| 6,144.0<br />6,604.8 |
|||
| 192.0<br />206.4 |
|||
| $999 |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name=CoreConfig|Main [[Unified shader model|shader processors]] : [[texture mapping unit]]s : [[render output unit]]s (streaming multiprocessors)}} |
|||
{{efn|name=PixelFillrate|Pixel fillrate is calculated as the number of ROPs multiplied by the respective core clock speed.}} |
|||
{{efn|name=TextureFillrate|Texture fillrate is calculated as the number of TMUs multiplied by the respective core clock speed.}} |
|||
{{efn|name=ProcessingPower|To calculate the processing power see [[Maxwell (microarchitecture)#Performance]].}} |
|||
{{efn|name=PerfValues|Base clock, Boost clock}} |
|||
}} |
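The bandwidth split described in the GTX 970 footnote follows directly from the bus arithmetic. A minimal sketch (the function name is illustrative, assuming the card's 7000 MT/s effective GDDR5):

```python
def bandwidth_gbs(effective_mts, bus_width_bits):
    # effective transfer rate x bus width in bytes, expressed in GB/s
    return effective_mts * bus_width_bits / 8 / 1000

# GTX 970: 7 interleaved 32-bit lanes vs. the solitary 8th lane
fast_pool = bandwidth_gbs(7000, 7 * 32)  # 196.0 GB/s
slow_pool = bandwidth_gbs(7000, 32)      # 28.0 GB/s, one seventh of the above
```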
|||
===GeForce 10 series=== |
|||
{{Further|GeForce 10 series|Pascal (microarchitecture)}} |
|||
* Supported display standards: [[DisplayPort 1.4|DP 1.4]] (no [[Display Stream Compression|DSC]]), [[HDMI 2.0b]], [[Digital Visual Interface|Dual-link DVI]]{{efn|The NVIDIA TITAN Xp and the Founders Edition GTX 1080 Ti do not have a dual-link DVI port, but a DisplayPort to single-link DVI adapter is included in the box.}}<ref>{{cite web|url=http://www.geforce.com/hardware/10series/geforce-gtx-1080|title=GTX 1080 Graphics Card|author1=[[Nvidia]]|access-date=May 7, 2016|archive-url=https://web.archive.org/web/20160507083310/http://www.geforce.com/hardware/10series/geforce-gtx-1080|archive-date=May 7, 2016|url-status=live}}</ref> |
|||
* Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 (12_1), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv" /> and [[CUDA]] 6.1 |
|||
* Improved [[Nvidia NVENC|NVENC]] ([[HEVC]] Main10, decode [[8K resolution|8K30]], etc.) |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="3" | Processing power ([[GFLOPS]]){{efn|name=PerfValues}}{{efn|name=ProcessingPower}} |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[Scalable Link Interface|SLI]] support |
|||
! colspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}}{{efn|As the GTX 1070 has one of the four GP104 GPCs disabled in the die, its frontend is only able to rasterize 48 pixels per clock.<ref name="smith1">{{cite web|url=http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/29|title=The Nvidia GeForce GTX 1080 & GTX 1070 Founders Editions Review: Kicking Off the FinFET Generation|last=Smith|first=Ryan|access-date=2016-07-21|archive-url=https://web.archive.org/web/20160723082331/http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/29|archive-date=2016-07-23|url-status=live}}</ref> Analogically, the GTX 1060 features only two GPCs on its GP106 die, meaning that its frontend can only rasterize 32 pixels per clock. The remaining backend ROPs can still be used for tasks such as MSAA.<ref name="anandtech11">{{cite web |url=http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review |title=The GeForce GTX 1060 Founders Edition & ASUS Strix GTX 1060 Review |access-date=2017-02-17 |archive-url=https://web.archive.org/web/20170218064526/http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review |archive-date=2017-02-18 |url-status=live}}</ref>}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
! [[Half precision floating-point format|Half precision]] |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! style="text-align:left;" | GeForce GT 1010<ref>{{cite web|title=NVIDIA GeForce GT 1010 Specs|url=https://www.techpowerup.com/gpu-specs/geforce-gt-1010.c3762|access-date=2021-02-14|website=TechPowerUp|language=en}}</ref> |
|||
| January 13, 2021 |
|||
| GP108 |
|||
| rowspan="6" |[[Samsung Electronics|Samsung]]<br />[[14 nm process|14LPP]] |
|||
| rowspan="3" |1.8 |
|||
| rowspan="3" |74 |
|||
| 256:16:16 |
|||
| PCIe 3.0 x4 |
|||
| 0.25 |
|||
| 1228 |
|||
| 1468 |
|||
| 5000 |
|||
| rowspan="4" |2 |
|||
| 40.1 |
|||
| GDDR5 |
|||
| rowspan="3" |64 |
|||
| 23.5 |
|||
| 23.5 |
|||
| 751.6 |
|||
| 23.5 |
|||
| {{unk}}
|||
| 30 |
|||
| rowspan="13" {{No}} |
|||
| {{unk}} |
|||
| rowspan="11" {{N/a}} |
|||
|- |
|||
! rowspan="2" style="text-align:left;" | GeForce GT 1030<ref name="gt1030">{{cite web |url=http://www.geforce.com/hardware/desktop-gpus/geforce-gt-1030/specifications |title=GeForce GT 1030 {{pipe}} Specifications {{pipe}} GeForce |access-date=2017-05-17 |archive-url=https://web.archive.org/web/20170520204241/http://www.geforce.com/hardware/desktop-gpus/geforce-gt-1030/specifications |archive-date=2017-05-20 |url-status=live }}</ref><ref name="gt1030ddr4">{{cite web |url=https://www.techspot.com/review/1658-geforce-gt-1030-abomination/ |title=GeForce GT 1030: The DDR4 Abomination Benchmarked |access-date=2018-08-22 |archive-url=https://web.archive.org/web/20180822213906/https://www.techspot.com/review/1658-geforce-gt-1030-abomination/ |archive-date=2018-08-22 |url-status=live }}</ref> |
|||
| March 12, 2018 |
|||
| GP108-310-A1 |
|||
| rowspan="2" | 384:24:16<br />(3) (1) |
|||
| rowspan="2" | PCIe 3.0 x4<ref>{{cite web |url=http://www.palit.com/palit/vgapro.php?id=2883&lang=en&pn=NE5103000646-1080F&tab=sp |title=::Palit Products - GeForce GT 1030 :: |access-date=2017-05-26 |archive-url=https://web.archive.org/web/20170614113347/http://www.palit.com/palit/vgapro.php?id=2883&lang=en&pn=NE5103000646-1080F&tab=sp |archive-date=2017-06-14 |url-status=live }}</ref><ref>{{cite web |url=https://www.msi.com/Graphics-card/GeForce-GT-1030-AERO-ITX-2G-OC.html#hero-specification |title=Overview GeForce GT 1030 AERO ITX 2G OC |access-date=2017-05-26 |archive-url=https://web.archive.org/web/20170630025507/https://www.msi.com/Graphics-card/GeForce-GT-1030-AERO-ITX-2G-OC.html#hero-specification |archive-date=2017-06-30 |url-status=live }}</ref> |
|||
| {{unk}}
|||
| 1152 |
|||
| 1379 |
|||
| 2100 |
|||
| 16.8 |
|||
| [[DDR4]] |
|||
| 18.4<br />22.0 |
|||
| 27.6<br />33.0 |
|||
| 884.7<br />1,059.0 |
|||
| 27.6<br />33.0 |
|||
| 13.8<br />16.5 |
|||
| 20 |
|||
| $79 |
|||
|- |
|||
| May 17, 2017 |
|||
| GP108-300-A1 |
|||
| 0.5 |
|||
| 1227 |
|||
| 1468 |
|||
| 6000 |
|||
| 48.0 |
|||
| rowspan="8" | [[GDDR5]] |
|||
| 19.6<br />23.4 |
|||
| 29.4<br />35.2 |
|||
| 942.3<br />1,127.4 |
|||
| 29.4<br />35.2 |
|||
| 14.7<br />17.6 |
|||
| 30 |
|||
| $70 |
|||
|- |
|||
! rowspan="2" style="text-align:left;" | GeForce GTX 1050<ref name="Geforce GTX 1050 Family">{{cite web|url=https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1050/|title=GeForce GTX 1050 Graphics Card|website=nvidia.com|language=en-us|access-date=2018-12-27|archive-url=https://web.archive.org/web/20161222152032/https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1050/|archive-date=2016-12-22|url-status=live}}</ref> |
|||
| October 25, 2016 |
|||
| GP107-300-A1 |
|||
| rowspan="3" | 3.3 |
|||
| rowspan="3" | 132 |
|||
| 640:40:32<br />(5) (2) |
|||
| rowspan="17" | PCIe 3.0 x16 |
|||
| 1 |
|||
| 1354 |
|||
| 1455 |
|||
| rowspan="3" | 7000 |
|||
| 112.0 |
|||
| 128 |
|||
| 43.3<br />46.6 |
|||
| 54.1<br />58.2
|||
| 1,733.1<br />1,862.4 |
|||
| 54.1<br />58.2 |
|||
| 27.0<br />29.1 |
|||
| rowspan="3" | 75<br />(Retail<br />up to<br />120) |
|||
| $109 |
|||
|- |
|||
| May 21, 2018 |
|||
| GP107-301-A1 |
|||
| 768:48:24<br />(6) (2) |
|||
| 0.75 |
|||
| 1392 |
|||
| 1518 |
|||
| 3 |
|||
| 84.0 |
|||
| 96 |
|||
| 33.4<br />36.4 |
|||
| 66.8<br />72.9 |
|||
| 2,138.1<br />2,331.6 |
|||
| 66.8<br />72.9 |
|||
| 33.4<br />36.4 |
|||
| {{Unk}} |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 1050 Ti<ref name="Geforce GTX 1050 Family" /> |
|||
| October 25, 2016 |
|||
| GP107-400-A1 |
|||
| 768:48:32<br />(6) (2) |
|||
| 1 |
|||
| 1290 |
|||
| 1392 |
|||
| 4 |
|||
| 112.0 |
|||
| 128 |
|||
| 41.2<br />44.5 |
|||
| 61.9<br />66.8 |
|||
| 1,981.4<br />2,138.1 |
|||
| 61.9<br />66.8 |
|||
| 30.9<br />33.4 |
|||
| $139 |
|||
|- |
|||
! rowspan="7" style="text-align:left;" | GeForce GTX 1060<br/><ref name="Geforce GTX 1060">{{cite web|url=http://www.geforce.com/hardware/10series/geforce-gtx-1060|title=GTX 1060 Graphics Card|website=nvidia.com|access-date=August 18, 2016|archive-url=https://web.archive.org/web/20160816081651/http://www.geforce.com/hardware/10series/geforce-gtx-1060|archive-date=August 16, 2016|url-status=live}}</ref><ref>{{cite web |url=https://wccftech.com/nvidia-5-gb-geforce-gtx-1060-chinese-cafes/ |title=NVIDIA Preps Cut Down, 5 GB GTX 1060 Graphics Card For Cafes |website=wccftech.com |date=26 December 2017 |access-date=2018-12-29 |archive-url=https://web.archive.org/web/20181220125200/https://wccftech.com/nvidia-5-gb-geforce-gtx-1060-chinese-cafes/ |archive-date=2018-12-20 |url-status=live }}</ref><ref>{{cite web |url=https://videocardz.com/68807/nvidia-launches-geforce-gtx-1080-11-gbps-and-gtx-1060-9-gbps |title=NVIDIA launches GeForce GTX 1080 11 Gbps and GTX 1060 9 Gbps |website=videocardz.com |date=20 April 2017 |access-date=2018-12-29 |archive-url=https://web.archive.org/web/20180902115910/https://videocardz.com/68807/nvidia-launches-geforce-gtx-1080-11-gbps-and-gtx-1060-9-gbps |archive-date=2018-09-02 |url-status=live }}</ref><ref>{{cite web |url=https://hexus.net/tech/news/graphics/123461-gigabyte-may-readying-geforce-gtx-1060-gddr5x/ |title=Gigabyte may be readying a GeForce GTX 1060 with GDDR5X |website=hexus.net |date=19 October 2018 |access-date=2018-12-29 |archive-url=https://web.archive.org/web/20181230080857/https://hexus.net/tech/news/graphics/123461-gigabyte-may-readying-geforce-gtx-1060-gddr5x/ |archive-date=2018-12-30 |url-status=live }}</ref> |
|||
| December 25, 2016 |
|||
| GP104-140-A1 |
|||
| rowspan="14" | [[TSMC]]<br />[[14 nm process|16FF]] |
|||
| 7.2 |
|||
| 314 |
|||
| rowspan="2" | 1152:72:48<br />(9) (2) |
|||
| 1.5? |
|||
| rowspan="8" | 1506 |
|||
| rowspan="7" | 1708 |
|||
| rowspan="6" | 8000 |
|||
| rowspan="2" | 3 |
|||
| rowspan="2" | 192.0 |
|||
| rowspan="2" | 192 |
|||
| rowspan="2" | 72.2<br />82.0
|||
| rowspan="2" | 108.4<br />122.9 |
|||
| rowspan="2" | 3,469.8<br />3,935.2 |
|||
| rowspan="2" | 108.4<br />122.9 |
|||
| rowspan="2" | 54.2<br />61.4 |
|||
| rowspan="7" | 120<br />(Retail<br />up to<br />200) |
|||
| rowspan="2" | $199 |
|||
|- |
|||
| August 18, 2016 |
|||
| GP106-300-A1 |
|||
| rowspan="2" | 4.4 |
|||
| rowspan="2" | 200 |
|||
| 1.5 |
|||
|- |
|||
| December 26, 2017 |
|||
| GP106-350-K3-A1 |
|||
| 1280:80:48<br />(10) (2) |
|||
| 1.25 |
|||
| 5 |
|||
| 160.0 |
|||
| 160 |
|||
| 60.2<br />68.3 |
|||
| rowspan="5" | 120.4<br />136.7 |
|||
| rowspan="5" | 3,855.3<br />4,375.0 |
|||
| rowspan="5" | 120.4<br />136.7 |
|||
| rowspan="5" | 60.2<br />68.3 |
|||
| {{okay|OEM}} |
|||
|- |
|||
| March 8, 2018 |
|||
| GP104-150-A1 |
|||
| rowspan="2" | 7.2 |
|||
| rowspan="2" | 314 |
|||
| rowspan="4" | 1280:80:48<br />(10) (2) |
|||
| rowspan="2" | 1.5 |
|||
| rowspan="4" | 6 |
|||
| rowspan="3" | 192.0 |
|||
| rowspan="4" | 192 |
|||
| rowspan="4" | 72.2<br />82.0 |
|||
| rowspan="2" {{unk}} |
|||
|- |
|||
| December 2018
|||
| GP104-150-KA-A1 |
|||
| [[GDDR5X]] |
|||
|- |
|||
| July 19, 2016 |
|||
| GP106-400-A1 |
|||
| rowspan="2" | 4.4 |
|||
| rowspan="2" | 200 |
|||
| rowspan="2" |1.5? |
|||
| rowspan="2" | [[GDDR5]] |
|||
| $249 |
|||
| $299 |
|||
|- |
|||
| April 20, 2017 |
|||
| GP106-410-A1 |
|||
| 9000 |
|||
| 216.0 |
|||
| $299 |
|||
| {{N/a}} |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 1070<ref name="GeForce GTX 1070">{{cite web |url=https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1070-ti/ |title=GEFORCE GTX 1070 FAMILY |website=nvidia.com |access-date=2018-12-29 |archive-url=https://web.archive.org/web/20171027024554/https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1070-ti/ |archive-date=2017-10-27 |url-status=live }}</ref><ref>{{cite web |url=https://www.techpowerup.com/250266/nvidia-unveils-geforce-gtx-1070-with-gddr5x-memory |title=NVIDIA Unveils GeForce GTX 1070 with GDDR5X Memory |website=techpowerup.com |access-date=2018-12-29 |archive-url=https://web.archive.org/web/20181230082620/https://www.techpowerup.com/250266/nvidia-unveils-geforce-gtx-1070-with-gddr5x-memory |archive-date=2018-12-30 |url-status=live }}</ref> |
|||
| June 10, 2016 /<br/>December 2018
|||
| GP104-200-A1 |
|||
| rowspan="4" | 7.2 |
|||
| rowspan="4" | 314 |
|||
| 1920:120:64<br />(15) (3) |
|||
| rowspan="4" | 2 |
|||
| rowspan="2" | 1683 |
|||
| rowspan="2" | 8000 |
|||
| rowspan="4" | 8 |
|||
| rowspan="2" | 256.0 |
|||
| [[GDDR5]]<br />[[GDDR5X]] |
|||
| rowspan="4" | 256 |
|||
| 96.3<br/>107.7 |
|||
| 180.7<br />201.9 |
|||
| 5,783.0<br />6,462.7 |
|||
| 180.7<br />201.9 |
|||
| 90.3<br />100.9 |
|||
| 150<br />(Retail<br />up to<br />250) |
|||
| rowspan="7" | 4-way SLI or 2-way [[Scalable Link Interface#SLI HB|SLI HB]]<ref>{{cite web|url=http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html|title=Nvidia GeForce GTX 1080 8 GB|author=W1zzard|date=May 17, 2016|publisher=TechPowerUp|access-date=May 17, 2016|archive-url=https://web.archive.org/web/20160521031101/http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html|archive-date=May 21, 2016|url-status=live}}</ref><ref>{{cite web|url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/23.html|title=Nvidia GeForce GTX 1080 SLI|author=W1zzard|date=June 21, 2016|publisher=TechPowerUp|access-date=June 21, 2016|archive-url=https://web.archive.org/web/20160624023141/http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/23.html|archive-date=June 24, 2016|url-status=live}}</ref> |
|||
| $379 |
|||
| $449 ($399)<ref name=":0">{{cite web|url=https://www.techarp.com/articles/geforce-gtx-1070-price-cut/|title=No, There Was No GeForce GTX 1070 Price Cut! - Tech ARP|website=www.techarp.com|date=2 November 2017|language=en-US|access-date=2017-11-05|archive-url=https://web.archive.org/web/20171107010929/https://www.techarp.com/articles/geforce-gtx-1070-price-cut/|archive-date=2017-11-07|url-status=live}}</ref> |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 1070 Ti<ref name="GeForce GTX 1070" /> |
|||
| November 2, 2017 |
|||
| GP104-300-A1 |
|||
| 2432:152:64<br />(19) (4) |
|||
| rowspan="3" | 1607 |
|||
| [[GDDR5]] |
|||
| 102.8<br/>107.7 |
|||
| 244.3<br />255.8
|||
| 7,816.4<br />8,186.1 |
|||
| 244.2<br />255.8 |
|||
| 122.1<br />127.9 |
|||
| rowspan="3" | 180<br />(Retail<br />up to<br />300) |
|||
| colspan="2" | $449 |
|||
|- |
|||
! rowspan="2" style="text-align:left;" | GeForce GTX 1080<ref name="GeForce GTX 1080">{{cite web|url=https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080/|title=GTX 1080 Graphics Card|author1=[[Nvidia]]|access-date=May 7, 2016|archive-url=https://web.archive.org/web/20161222152023/https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080/|archive-date=December 22, 2016|url-status=live}}</ref> |
|||
| May 27, 2016 |
|||
| GP104-400-A1 |
|||
| rowspan="2" | 2560:160:64<br />(20) (4) |
|||
| rowspan="2" | 1733 |
|||
| 10000 |
|||
| 320.0 |
|||
| rowspan="5" | [[GDDR5X]] |
|||
| rowspan="2" | 102.8<br />110.9 |
|||
| rowspan="2" | 257.1<br />277.2 |
|||
| rowspan="2" | 8,227.8<br />8,872.9 |
|||
| rowspan="2" | 257.1<br />277.2 |
|||
| rowspan="2" | 128.5<br />138.6 |
|||
| rowspan="2" | $599 ($499)<ref name=":0" /> |
|||
| rowspan="2" | $699 ($549)<ref name=":0" /> |
|||
|- |
|||
| April 20, 2017 |
|||
| GP104-410-A1 |
|||
| rowspan="2" | 11000 |
|||
| 352.0 |
|||
|- |
|||
! style="text-align:left;" | GeForce GTX 1080 Ti<ref name="GeForce GTX 1080 Ti">{{cite web|url=https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/|title=GeForce GTX 1080 Ti Graphics Card|author1=[[Nvidia]]|access-date=March 1, 2017|archive-url=https://web.archive.org/web/20170301191005/https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/|archive-date=March 1, 2017|url-status=live}}</ref> |
|||
| March 5, 2017 |
|||
| GP102-350-K1-A1 |
|||
| rowspan="3" | 12 |
|||
| rowspan="3" | 471 |
|||
| 3584:224:88<br />(28) (6) |
|||
| 2.75 |
|||
| 1480 |
|||
| 1582 |
|||
| 11 |
|||
| 484.0 |
|||
| 352 |
|||
| 130.2<br />139.2 |
|||
| 331.5<br />354.3 |
|||
| 10,608.6<br />11,339.7 |
|||
| 331.5<br />354.3 |
|||
| 165.7<br />177.1 |
|||
| rowspan="3" | 250 |
|||
| colspan="2" | $699 |
|||
|- |
|||
! style="text-align:left;" | Nvidia TITAN X<ref name="Nvidia TITAN X">{{cite web|url=http://www.geforce.com/hardware/10series/titan-x-pascal|title=Nvidia TITAN X Graphics Card|author1=[[Nvidia]]|access-date=July 21, 2016|archive-url=https://web.archive.org/web/20160722050701/http://www.geforce.com/hardware/10series/titan-x-pascal|archive-date=July 22, 2016|url-status=live}}</ref> |
|||
| August 2, 2016 |
|||
| GP102-400-A1 |
|||
| 3584:224:96<br />(28) (6) |
|||
| rowspan="2" |3 |
|||
| 1417 |
|||
| 1531 |
|||
| 10000 |
|||
| rowspan="2" | 12 |
|||
| 480.0 |
|||
| rowspan="2" | 384 |
|||
| 136.0<br />146.9 |
|||
| 317.4<br />342.9 |
|||
| 10,157.0<br />10,974.2 |
|||
| 317.4<br />342.9 |
|||
| 158.7<br />171.4 |
|||
| rowspan="2" | $1200 |
|||
| rowspan="2" {{N/A}} |
|||
|- |
|||
! style="text-align:left;" | Nvidia TITAN Xp<ref name="Nvidia TITAN Xp">{{cite web|url=https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/|title=TITAN Xp Graphics Card with Pascal Architecture|author1=[[Nvidia]]|access-date=April 6, 2017|archive-url=https://web.archive.org/web/20170406223729/https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/|archive-date=April 6, 2017|url-status=live}}</ref> |
|||
| April 6, 2017 |
|||
| GP102-450-A1 |
|||
| 3840:240:96<br />(30) (6) |
|||
| 1405 |
|||
| 1480 |
|||
| 11400 |
|||
| 547.7 |
|||
| 134.8<br />142.0 |
|||
| 337.2<br />355.2 |
|||
| 10,790.4<br />11,366.4 |
|||
| 337.2<br />355.2 |
|||
| 168.6<br />177.6 |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}}{{efn|As the GTX 1070 has one of the four GP104 GPCs disabled in the die, its frontend is only able to rasterize 48 pixels per clock.<ref name="smith1"/> Analogically, the GTX 1060 features only two GPCs on its GP106 die, meaning that its frontend can only rasterize 32 pixels per clock. The remaining backend ROPs can still be used for tasks such as MSAA.<ref name="anandtech11"/>}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
! [[Half precision floating-point format|Half precision]] |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[Scalable Link Interface|SLI]] support |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="3" | Processing power ([[GFLOPS]]){{efn|name=PerfValues}}{{efn|name=ProcessingPower}} |
|||
! colspan="2" | Release price (USD) |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name=CoreConfig|Main [[Unified shader model|shader processors]] : [[texture mapping unit]]s : [[render output unit]]s (streaming multiprocessors) (graphics processing clusters)}} |
|||
{{efn|name=PixelFillrate|Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.}} |
|||
{{efn|name=TextureFillrate|Texture fillrate is calculated as the number of TMUs multiplied by the base core clock speed.}} |
|||
{{efn|name=ProcessingPower|To calculate the processing power see [[Pascal (microarchitecture)#Performance]].}} |
|||
{{efn|name=PerfValues|Base clock, Boost clock}} |
|||
}} |
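The fillrate and processing-power cells in the table follow from the footnote formulas. A minimal sketch, checked against the GeForce GTX 1080 row above (2560:160:64 core config, 1607 MHz base clock; the helper names are illustrative, not from any source):

```python
def pixel_fillrate_gps(rops, clock_mhz):
    # ROPs x core clock in GHz; the effective value is the minimum of this
    # and the rasterizer/SM limits, which coincide for a full GP104 die
    return rops * clock_mhz / 1000

def texture_fillrate_gts(tmus, clock_mhz):
    # TMUs x core clock in GHz -> gigatexels per second
    return tmus * clock_mhz / 1000

def fp32_gflops(shaders, clock_mhz):
    # 2 operations (fused multiply-add) per shader per clock on Pascal
    return 2 * shaders * clock_mhz / 1000

print(round(pixel_fillrate_gps(64, 1607), 1))    # 102.8 GP/s
print(round(texture_fillrate_gts(160, 1607), 1)) # 257.1 GT/s
print(round(fp32_gflops(2560, 1607), 1))         # 8227.8 GFLOPS
```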
|||
===Volta series=== |
|||
{{Further|Volta (microarchitecture)}} |
|||
* Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 (12_1), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv" /> and [[CUDA]] 7.0 |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|Interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" |[[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="4" | Processing power ([[GFLOPS]]){{efn|name=PerfValues}} |
|||
! rowspan="2" |[[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[NVLink]] Support |
|||
! colspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single-precision floating-point format|Single precision]] |
|||
! [[Double-precision floating-point format|Double precision]] |
|||
! [[Half-precision floating-point format|Half precision]] |
|||
! [[Tensor]] compute + Single precision |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! style="text-align:left;" | Nvidia TITAN V<ref>{{cite web|url=https://www.nvidia.com/en-us/titan/titan-v|title=Nvidia TITAN V Graphics Card|author1=[[Nvidia]]|access-date=December 8, 2017|archive-url=https://web.archive.org/web/20171208093132/https://www.nvidia.com/en-us/titan/titan-v/|archive-date=December 8, 2017|url-status=live}}</ref> |
|||
| December 7, 2017 |
|||
| GV100-400-A1 |
|||
| rowspan="2" | [[TSMC]]<br />[[14 nm process|12FFN]] |
|||
| rowspan="2" | 21.1 |
|||
| rowspan="2" | 815 |
|||
| 5120:320:96:640<br />(80) (6) |
|||
| rowspan="2" | PCIe 3.0 x16 |
|||
| 4.5 |
|||
| rowspan="2" | 1200 |
|||
| rowspan="2" | 1455 |
|||
| rowspan="2" | 1700 |
|||
| 12 |
|||
| 652.8 |
|||
| rowspan="2" |[[HBM2]] |
|||
| 3072 |
|||
| rowspan="2" | 153.6<br />186.2 |
|||
| rowspan="2" | 384.0<br />465.6 |
|||
| rowspan="2" | 12,288.0<br />14,899.2 |
|||
| rowspan="2" | 6,144.0<br />7,449.6 |
|||
| rowspan="2" | 24,576.0<br />29,798.4 |
|||
| rowspan="2" | 110,592.0<br />134,092.8 |
|||
| rowspan="2" | 250 |
|||
| {{No}} |
|||
| $2999 |
|||
| {{N/a}} |
|||
|- |
|||
! style="text-align:left;" | Nvidia TITAN V<br>CEO Edition<ref>{{Cite news|url=https://www.anandtech.com/show/13004/nvidia-limited-edition-32gb-titan-v-ceo-edition|title=NVIDIA Unveils & Gives Away New Limited Edition 32GB Titan V "CEO Edition"|last=Smith|first=Ryan|access-date=2018-08-08|archive-url=https://web.archive.org/web/20180730215157/https://www.anandtech.com/show/13004/nvidia-limited-edition-32gb-titan-v-ceo-edition|archive-date=2018-07-30|url-status=live}}</ref><ref>{{Cite news|url=https://www.techpowerup.com/gpudb/3277/titan-v-ceo-edition|title=NVIDIA TITAN V - CEO Edition|work=TechPowerUp|access-date=2018-08-08|language=en}}{{dead link|date=June 2022|bot=medic}}{{cbignore|bot=medic}}</ref> |
|||
| June 21, 2018 |
|||
| GV100-???-A1
|||
| 5120:320:128:640<br />(80) (6) |
|||
| 6 |
|||
| 32 |
|||
| 870.4 |
|||
| 4096 |
|||
| {{No}} |
|||
| colspan="2" {{N/a}} |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name=CoreConfig|Main [[Unified shader model|shader processors]] : [[texture mapping unit]]s : [[render output unit]]s : [[tensor core]]s (streaming multiprocessors) (graphics processing clusters)}} |
|||
{{efn|name=PixelFillrate|Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.}} |
|||
{{efn|name=TextureFillrate|Texture fillrate is calculated as the number of TMUs multiplied by the base core clock speed.}} |
|||
{{efn|name=PerfValues|Base clock, Boost clock}} |
|||
}} |
|||
===GeForce 16 series=== |
|||
{{Further|GeForce 16 series|Turing (microarchitecture)}} |
|||
* Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 (feature level 12_1), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv" /> and [[CUDA]] 7.5 |
|||
* [[Nvidia NVENC|NVENC]] 6th generation ([[B frame|B-frame]], etc.) |
|||
* TU117 only supports Volta [[Nvidia NVENC|NVENC]] (5th generation) |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Bus (computing)|Bus]] [[Input/output|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]] ([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="3" | Processing power (G[[FLOPS]]){{efn|name=PerfValues}} |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[NVLink]] support |
|||
! rowspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|GT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single-precision floating-point format|Single precision]] |
|||
! [[Double-precision floating-point format|Double precision]] |
|||
! [[Half-precision floating-point format|Half precision]] |
|||
|- |
|||
! style="text-align:left;" |GeForce GTX 1630 |
|||
| June 28, 2022<ref>{{cite web |title=NVIDIA officially launches GeForce GTX 1630 graphics card |url=https://videocardz.com/newz/nvidia-officially-launches-geforce-gtx-1630-graphics-card |access-date=2022-06-28 |website=VideoCardz.com |language=en-US}}</ref> |
|||
| TU117-150-A1 |
|||
| rowspan="8" |[[TSMC]]<br />[[14 nm process|12FFN]] |
|||
| rowspan="3" |4.7 |
|||
| rowspan="3" |200 |
|||
| 512:32:16:1024:0<br />(8) (?) |
|||
| rowspan="8" |PCIe 3.0 x16 |
|||
| rowspan="4" |1 |
|||
| 1740 |
|||
| 1785 |
|||
| 12 |
|||
| rowspan="5" |4 |
|||
| 96 |
|||
| [[GDDR6 SDRAM|GDDR6]] |
|||
| 64 |
|||
| 27.84
|||
| 55.68
|||
| 1,781.76<br />1,827.84 |
|||
| 55.68<br />57.12 |
|||
| 3,563.52<br />3,655.68 |
|||
| rowspan="3" |75 |
|||
| rowspan="8" {{No}} |
|||
| {{unk}}
|||
|- |
|||
! rowspan="3" style="text-align:left;" |GeForce GTX 1650<ref>{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1650/|title=NVIDIA GeForce GTX 1650 Graphics Card|website=NVIDIA|access-date=2019-04-23|archive-url=https://web.archive.org/web/20190423131601/https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1650/|archive-date=2019-04-23|url-status=live}}</ref> |
|||
| April 23, 2019 |
|||
| rowspan="2" |TU117-300-A1 |
|||
| rowspan="3" |896:56:32:1792:0<br />(14) (2) |
|||
| 1485 |
|||
| 1665 |
|||
| 8 |
|||
| 128 |
|||
| [[GDDR5 SDRAM|GDDR5]] |
|||
| rowspan="4" |128 |
|||
| 47.52 |
|||
| 83.16 |
|||
| 2,661.12<br />2,983.68
|||
| 83.16<br />93.24 |
|||
| 5,322.24<br />5,967.36
|||
| rowspan="3" |$149 |
|||
|- |
|||
| April 3, 2020<ref>{{cite web|url=https://www.anandtech.com/show/15701/nvidias-geforce-gtx-1650-gddr6-released-gddr5-price-parity|title=NVIDIA's GeForce GTX 1650 GDDR6 Released: GDDR6 Reaching Price Parity With GDDR5|website=Anandtech|access-date=2020-04-06}}</ref> |
|||
| rowspan="2" |1410 |
|||
| rowspan="2" |1590 |
|||
| rowspan="3" |12 |
|||
| rowspan="4" |192 |
|||
| rowspan="3" |[[GDDR6 SDRAM|GDDR6]] |
|||
| 45.12 |
|||
| 78.96 |
|||
| rowspan="2" | 2,526.72<br />2,849.28 |
|||
| rowspan="2" | 78.96<br />89.04 |
|||
| rowspan="2" | 5,053.44<br />5,698.56 |
|||
|- |
|||
| June 18, 2020<ref>{{cite web|title=NVIDIA GeForce GTX 1650 TU106 Specs|url=https://www.techpowerup.com/gpu-specs/geforce-gtx-1650-tu106.c3585|access-date=2021-12-14|website=TechPowerUp|language=en}}</ref> |
|||
| TU106-125-A1 |
|||
| 10.8 |
|||
| 445 |
|||
| 45.12
|||
| 78.96
|||
| 90 |
|||
|- |
|||
! style="text-align:left;" |GeForce GTX 1650 Super<ref>{{cite web|url=https://www.nvidia.com/en-gb/geforce/graphics-cards/gtx-1650-super/|title=NVIDIA GeForce GTX 1650 SUPER Graphics Card|website=NVIDIA|access-date=2019-10-29}}</ref> |
|||
| November 22, 2019<ref>{{cite web|url=https://www.anandtech.com/show/15041/nvidia-announces-geforce-gtx-1650-super-launching-november-22nd|title=NVIDIA Announces GeForce GTX 1650 Super: Launching November 22nd|access-date=2019-10-29}}</ref> |
|||
| TU116-250-KA-A1<ref>{{cite web|url=https://www.cnet.com/news/gpu-memory-memory-bandwidth-memory-clock-gpu-clock-speed-memory-data-rateinterface-texture-fill-rate-ray-tracing-rt/|title=GTX 1660, 1650 Super boost speeds for Nvidia's cheapest gaming cards|access-date=2019-10-29}}</ref> |
|||
| rowspan="4" |6.6 |
|||
| rowspan="4" |284 |
|||
| 1280:80:32:2560:0<br />(20) (3) |
|||
| rowspan="4" |1.5 |
|||
| rowspan="3" |1530 |
|||
| 1725 |
|||
| 48.96 |
|||
| 122.40 |
|||
| 3,916.80<br />4,416.00 |
|||
| 122.40<br />138.00 |
|||
| 7,833.60 <br />8,832.00 |
|||
| 100 |
|||
| $159 |
|||
|- |
|||
! style="text-align:left;" |GeForce GTX 1660<ref>{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1660-ti/|title=The GeForce 16 Series Graphics Cards are Here|website=NVIDIA|language=en-us|access-date=2019-03-23|archive-url=https://web.archive.org/web/20190325235255/https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1660-ti/|archive-date=2019-03-25|url-status=live}}</ref> |
|||
| March 14, 2019 |
|||
| TU116-300-A1 |
|||
| rowspan="2" |1408:88:48:2816:0<br />(22) (3) |
|||
| rowspan="2" |1785 |
|||
| 8 |
|||
| rowspan="3" |6 |
|||
| [[GDDR5 SDRAM|GDDR5]] |
|||
| rowspan="3" |192 |
|||
| rowspan="2" |73.44 |
|||
| rowspan="2" |134.64 |
|||
| rowspan="2" |4,308.00<br />5,027.00 |
|||
| rowspan="2" |134.64<br />157.08 |
|||
| rowspan="2" |8,616.00<br />10,053.00 |
|||
| 120 |
|||
| $219 |
|||
|- |
|||
! style="text-align:left;" |GeForce GTX 1660 Super<ref>{{cite web|url=https://www.nvidia.com/en-gb/geforce/graphics-cards/gtx-1660-super/|title=NVIDIA GeForce GTX 1660 SUPER Graphics Card|website=NVIDIA|access-date=2019-10-29}}</ref> |
|||
| October 29, 2019 |
|||
| TU116-300-A1 |
|||
| 14 |
|||
| 336 |
|||
| rowspan="2" |[[GDDR6 SDRAM|GDDR6]] |
|||
| 125 |
|||
| $229 |
|||
|- |
|||
! style="text-align:left;" |GeForce GTX 1660 Ti<ref>{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1660-ti/|title=NVIDIA GeForce GTX 1660 Ti Graphics Card|website=NVIDIA|access-date=2019-02-22|archive-url=https://web.archive.org/web/20190222141328/https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1660-ti/|archive-date=2019-02-22|url-status=live}}</ref>
|||
| February 21, 2019 |
|||
| TU116-400-A1 |
|||
| 1536:96:48:3072:0<br />(24) (3) |
|||
| 1500 |
|||
| 1770 |
|||
| 12 |
|||
| 288 |
|||
| 72.00 |
|||
| 144.00 |
|||
| 4,608.00<br />5,437.44 |
|||
| 144.00<br />169.92 |
|||
| 9,216.00<br />10,874.88 |
|||
| 120 |
|||
| $279 |
|||
|} |
|||
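The fillrate and processing-power columns in these tables follow directly from the core configuration and clock speeds. A minimal Python sketch (helper names are illustrative, not from any library), using the GTX 1660 Ti's figures as a spot check:

```python
def texture_fillrate_gts(tmus, clock_mhz):
    """Texture fillrate in GT/s: number of TMUs times the core clock."""
    return tmus * clock_mhz / 1000

def fp32_gflops(shaders, clock_mhz):
    """Single-precision GFLOPS: each shader performs one FMA (2 ops) per clock."""
    return 2 * shaders * clock_mhz / 1000

# GTX 1660 Ti: 1536 shaders, 96 TMUs, base 1500 MHz / boost 1770 MHz
print(texture_fillrate_gts(96, 1500))  # 144.0 GT/s, matching the table
print(fp32_gflops(1536, 1500))         # 4608.0 GFLOPS at base clock
print(fp32_gflops(1536, 1770))         # 5437.44 GFLOPS at boost clock
```

Half-precision throughput in the Turing tables is simply twice the single-precision figure.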
===GeForce 20 series=== |
|||
{{Further|GeForce 20 series|Turing (microarchitecture)}} |
|||
* Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 Ultimate (12_2), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv" /> and [[CUDA]] 7.5 |
|||
* Unlike previous generations, the Founders Edition cards of the non-Super RTX models (RTX 2070, RTX 2080, RTX 2080 Ti) do not run at reference clocks but are factory-overclocked, while the Founders Edition cards of the Super models (RTX 2060 Super, RTX 2070 Super, and RTX 2080 Super) run at reference clocks.
|||
* [[NVENC]] 6th generation ([[B frame|B-frame]], etc.) |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Computer bus|Bus]] [[I/O interface|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]]{{efn|name=PerfValues}} |
|||
! colspan="4" | Processing power ([[GFLOPS]]){{efn|name=PerfValues}} |
|||
! colspan="2" | [[Ray tracing (graphics)|Ray-tracing]] Performance |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[NVLink]] support |
|||
! colspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|GT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s){{efn|name=PixelFillrate}} |
|||
! Texture ([[Texel (graphics)|GT]]/s){{efn|name=TextureFillrate}} |
|||
! [[Single precision floating-point format|Single precision]] |
|||
! [[Double precision floating-point format|Double precision]] |
|||
! [[Half precision floating-point format|Half precision]] |
|||
! [[Tensor]] compute (FP16) |
|||
! Rays/s (Billions) |
|||
! RTX OPS/s (Trillions) |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! rowspan="3" style="text-align:left;" | GeForce RTX 2060<ref name=":3">{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060/|title=NVIDIA GeForce RTX 2060 Graphics Card|website=NVIDIA|access-date=2019-01-08|archive-url=https://web.archive.org/web/20190107124509/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060/|archive-date=2019-01-07|url-status=live}}</ref> |
|||
| January 15, 2019 |
|||
| TU106-200-KA-A1 |
|||
| rowspan="10" | [[TSMC]]<br />[[14 nm process|12FFN]] |
|||
| 10.8 |
|||
| 445 |
|||
| rowspan="2" | 1920:120:48:240:30<br />(30) (3) |
|||
| rowspan="10" | PCIe 3.0 x16 |
|||
| rowspan="3" | 3 |
|||
| rowspan="2" | 1365 |
|||
| rowspan="2" | 1680 |
|||
| rowspan="7" | 14 |
|||
| rowspan="2" | 6 |
|||
| rowspan="3" | 336.0 |
|||
| rowspan="10" | [[GDDR6 SDRAM|GDDR6]] |
|||
| rowspan="3" | 192 |
|||
| rowspan="2" | 65.52<br />80.64 |
|||
| rowspan="2" | 163.80<br />201.60 |
|||
| rowspan="2" | 5,241.60<br />6,451.20
|||
| rowspan="2" | 163.80<br />201.60 |
|||
| rowspan="2" | 10,483.20<br />12,902.40
|||
| rowspan="2" | 41,932.80<br />51,609.60
|||
| rowspan="2" | 5 |
|||
| rowspan="2" | 37 |
|||
| rowspan="2" | 160 |
|||
| rowspan="5" {{No}} |
|||
| $349 |
|||
| rowspan="3" {{N/a}} |
|||
|- |
|||
| January 10, 2020 |
|||
| TU104-150-KC-A1<ref>{{cite web|url=https://www.techpowerup.com/gpu-specs/geforce-rtx-2060-tu104.c3495|title=NVIDIA GeForce RTX 2060 TU104 Specs|website=TechPowerUp|language=en|access-date=2020-02-18}}</ref> |
|||
| 13.6 |
|||
| 545 |
|||
| rowspan="2" | $299 |
|||
|- |
|||
| December 7, 2021<ref>{{cite web|url=https://videocardz.com/newz/nvidia-launches-geforce-rtx-2060-12gb-a-perfect-card-for-crypto-miners|title=NVIDIA launches GeForce RTX 2060 with 12GB and TU106-300 GPU, overpriced gaming GPU for miners|website=videocardz|language=en|access-date=2021-07-09}}</ref> |
|||
| TU106-300-KA-A1 |
|||
| rowspan="3" | 10.8 |
|||
| rowspan="3" | 445 |
|||
| rowspan="2" | 2176:136:64:272:34<br />(34) (3) |
|||
| rowspan="2" | 1470 |
|||
| rowspan="2" | 1650 |
|||
| 12 |
|||
| 70.56<br />79.20
|||
| rowspan="2" | 199.92<br />224.40 |
|||
| rowspan="2" | 6,400.00<br />7,180.00
|||
| rowspan="2" | 199.92<br />224.40 |
|||
| rowspan="2" | 12,800.00<br />14,360.00
|||
| |
|||
| |
|||
| |
|||
| 185 |
|||
|- |
|||
! style="text-align:left;" | GeForce RTX 2060 Super<ref name="anandtech_super">{{cite web|url=https://www.anandtech.com/show/14586/geforce-rtx-2070-super-rtx-2060-super-review|title=The GeForce RTX 2070 Super & RTX 2060 Super Review|website=Anandtech|date=2019-07-02|access-date=2019-07-02|archive-url=https://web.archive.org/web/20190702140732/https://www.anandtech.com/show/14586/geforce-rtx-2070-super-rtx-2060-super-review|archive-date=2019-07-02|url-status=live}}</ref><ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060-super/ |title=Your Graphics, Now with SUPER Powers |access-date=2019-07-02 |archive-url=https://web.archive.org/web/20190702140732/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2060-super/ |archive-date=2019-07-02 |url-status=live }}</ref> |
|||
| July 9, 2019 |
|||
| TU106-410-A1 |
|||
| rowspan="5" | 4 |
|||
| rowspan="5" | 8 |
|||
| rowspan="4" | 448.0 |
|||
| rowspan="5" | 256 |
|||
| 94.08<br />105.60 |
|||
| 51,200.00<br />57,440.00 |
|||
| rowspan="2" | 6 |
|||
| 41 |
|||
| rowspan="2" | 175 |
|||
| colspan="2" | $399 |
|||
|- |
|||
! style="text-align:left;" | GeForce RTX 2070<ref name="Geforce RTX 2070">{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070/|title=Introducing NVIDIA GeForce RTX 2070 Graphics Card|website=NVIDIA|language=en-us|access-date=2018-08-20|archive-url=https://web.archive.org/web/20180820234826/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070/|archive-date=2018-08-20|url-status=live}}</ref><ref>{{cite web|url=https://www.pcgamer.com/nvidia-turing-architecture-deep-dive/|title=Nvidia Turing architecture deep dive|website=pcgamer.com|language=en-us|access-date=2018-09-27|archive-url=https://web.archive.org/web/20180918231242/https://www.pcgamer.com/nvidia-turing-architecture-deep-dive/|archive-date=2018-09-18|url-status=live}}</ref> |
|||
| October 17, 2018 |
|||
| TU106-400-A1 |
|||
| 2304:144:64:288:36<br />(36) (3) |
|||
| 1410 |
|||
| 1620 |
|||
| 90.24<br />103.68 |
|||
| 203.04<br />233.28 |
|||
| 6,497.28<br />7,464.96 |
|||
| 203.04<br />233.28 |
|||
| 12,994.56<br />14,929.92 |
|||
| 51,978.24<br />59,719.68 |
|||
| 42 |
|||
| rowspan="2" | $499 |
|||
| $599 |
|||
|- |
|||
! style="text-align:left;"| GeForce RTX 2070 Super<ref name="anandtech_super" /><ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070-super/ |title=Your Graphics, Now with SUPER Powers |access-date=2019-07-02 |archive-url=https://web.archive.org/web/20190702140730/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2070-super/ |archive-date=2019-07-02 |url-status=live }}</ref> |
|||
| July 9, 2019 |
|||
| TU104-410-A1 |
|||
| rowspan="3" | 13.6 |
|||
| rowspan="3" | 545 |
|||
| 2560:160:64:320:40<br />(40) (5) |
|||
| 1605 |
|||
| 1770 |
|||
| 102.72<br />113.28
|||
| 256.80<br />283.20 |
|||
| 8,220.00<br />9,060.00 |
|||
| 256.80<br />283.20 |
|||
| 16,440.00<br />18,120.00 |
|||
| 65,760.00<br />72,480.00 |
|||
| 7 |
|||
| 52 |
|||
| rowspan="2" | 215 |
|||
| rowspan="5" | 2-way [[NVLink]] |
|||
| $499 |
|||
|- |
|||
! style="text-align:left;"| GeForce RTX 2080<ref name="Geforce RTX 2080">{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080/|title=NVIDIA GeForce RTX 2080 Founders Edition Graphics Card|website=NVIDIA|language=en-us|access-date=2018-08-20|archive-url=https://web.archive.org/web/20180820234828/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080/|archive-date=2018-08-20|url-status=live}}</ref><ref>{{cite web|url=https://www.pcmag.com/news/363762/nvidia-can-automatically-overclock-your-geforce-rtx-in-20-mi|title=Nvidia Can Automatically Overclock Your GeForce RTX in 20 Minutes|website=PCmag.com|language=en-us|access-date=2018-09-20|archive-url=https://web.archive.org/web/20180920160829/https://www.pcmag.com/news/363762/nvidia-can-automatically-overclock-your-geforce-rtx-in-20-mi|archive-date=2018-09-20|url-status=live}}</ref> |
|||
| September 20, 2018 |
|||
| TU104-400-A1 |
|||
| 2944:184:64:368:46<br />(46) (6) |
|||
| 1515 |
|||
| 1710 |
|||
| 96.96<br />109.44 |
|||
| 278.76<br />314.64 |
|||
| 8,920.32<br />10,068.48 |
|||
| 278.76<br />314.64 |
|||
| 17,840.64<br />20,136.96 |
|||
| 71,362.56<br />80,547.84 |
|||
| rowspan="2" | 8 |
|||
| 57 |
|||
| rowspan="2" | $699 |
|||
| $799 |
|||
|- |
|||
! style="text-align:left;"| GeForce RTX 2080 Super<ref name="anandtech_super" /><ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-super/ |title=Your Graphics, Now with SUPER Powers |access-date=2019-07-02 |archive-url=https://web.archive.org/web/20190702140731/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-super/ |archive-date=2019-07-02 |url-status=live }}</ref> |
|||
| July 23, 2019 |
|||
| TU104-450-A1 |
|||
| 3072:192:64:384:48<br />(48) (6) |
|||
| 1650 |
|||
| 1815 |
|||
| 15.5 |
|||
| 496.0 |
|||
| 105.60<br />116.16 |
|||
| 316.80<br />348.48 |
|||
| 10,140.00<br />11,150.00 |
|||
| 316.80<br />348.48
|||
| 20,280.00<br />22,300.00 |
|||
| 81,120.00<br />89,200.00 |
|||
| 63 |
|||
| rowspan="2" | 250 |
|||
| $699 |
|||
|- |
|||
! style="text-align:left;"| GeForce RTX 2080 Ti<ref name="Geforce RTX 2080 Ti">{{cite web|url=https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-ti/|title=Graphics Reinvented: NVIDIA GeForce RTX 2080 Ti Graphics Card|website=NVIDIA|language=en-us|access-date=2018-08-20|archive-url=https://web.archive.org/web/20180820170549/https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-ti/|archive-date=2018-08-20|url-status=live}}</ref> |
|||
| September 27, 2018 |
|||
| TU102-300-K1-A1 |
|||
| rowspan="2" | 18.6 |
|||
| rowspan="2" | 754 |
|||
| 4352:272:88:544:68<br />(68) (6) |
|||
| 5.5 |
|||
| rowspan="2" | 1350 |
|||
| 1545 |
|||
| rowspan="2" | 14 |
|||
| 11 |
|||
| 616.0 |
|||
| 352 |
|||
| 118.80<br />135.96 |
|||
| 367.20<br />420.24 |
|||
| 11,750.40<br />13,447.68 |
|||
| 367.20<br />420.24 |
|||
| 23,500.80<br />26,895.36 |
|||
| 94,003.20<br />107,581.44 |
|||
| 10 |
|||
| 76 |
|||
| $999 |
|||
| $1,199 |
|||
|- |
|||
! style="text-align:left;"| Nvidia TITAN RTX<ref name="Titan RTX">{{cite web|url=https://www.nvidia.com/en-us/titan/titan-rtx/|title=TITAN RTX Ultimate PC Graphics Card with Turing: NVIDIA|website=nvidia.com|access-date=2018-12-27|archive-url=https://web.archive.org/web/20181226141249/https://www.nvidia.com/en-us/titan/titan-rtx/|archive-date=2018-12-26|url-status=live}}</ref> |
|||
| December 18, 2018 |
|||
| TU102-400-A1 |
|||
| 4608:288:96:576:72<br />(72) (6) |
|||
| 6 |
|||
| 1770{{efn|name=TitanRTXBoost}} |
|||
| 24 |
|||
| 672.0 |
|||
| 384 |
|||
| 129.60<br />169.92 |
|||
| 388.80<br />509.76 |
|||
| 12,441.60<br />16,312.32
|||
| 388.80<br />509.76 |
|||
| 24,883.20<br />32,624.64
|||
| 99,532.80<br />130,498.56
|||
| 11 |
|||
| 84 |
|||
| 280 |
|||
| {{N/a}} |
|||
| $2,499 |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name=CoreConfig|Main [[Unified shader model|shader processors]] : [[texture mapping unit]]s : [[render output unit]]s : [[tensor core]]s (or FP16 cores in GeForce 16 series) : [[Ray tracing (graphics)|ray-tracing]] cores (streaming multiprocessors) (graphics processing clusters)}} |
|||
{{efn|name=PixelFillrate|Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.}} |
|||
{{efn|name=TextureFillrate|Texture fillrate is calculated as the number of TMUs multiplied by the base core clock speed.}} |
|||
{{efn|name=PerfValues|Base clock, Boost clock}} |
|||
{{efn|name=TitanRTXBoost|Boost of the Founders Editions, as there is no reference version of this card.}} |
|||
}} |
|||
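The pixel-fillrate footnote above defines the value as the lowest of three per-clock limits. A small sketch of that rule (the fragments-per-rasterizer and fragments-per-SM counts below are assumptions for illustration, not figures from this article), checked against the RTX 2060 row:

```python
def pixel_fillrate_gps(rops, rasterizers, frags_per_raster,
                       sms, frags_per_sm, clock_mhz):
    """Pixel fillrate in GP/s: the lowest of the three per-clock limits
    from the table footnote, multiplied by the base core clock."""
    bottleneck = min(rops, rasterizers * frags_per_raster, sms * frags_per_sm)
    return bottleneck * clock_mhz / 1000

# RTX 2060: 48 ROPs, 3 GPCs (assume 16 fragments per rasterizer),
# 30 SMs (assume 2 fragments per SM per clock), base clock 1365 MHz
print(pixel_fillrate_gps(48, 3, 16, 30, 2, 1365))  # 65.52 GP/s, ROP-limited
```

Under these assumptions the ROP count is the bottleneck, so the result reduces to ROPs times the base clock, matching the table.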
=== GeForce 30 series === |
|||
{{Further|GeForce 30 series|Ampere (microarchitecture)}} |
|||
* Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 Ultimate (12_2), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3<ref name="vulkandrv" /> and [[CUDA]] 8.6 |
|||
* Supported display connections: [[HDMI]] 2.1, [[DisplayPort]] 1.4a |
|||
* [[NVENC]] 7th generation |
|||
* [[Tensor Core|Tensor core]] 3rd gen |
|||
* [[RT core]] 2nd gen |
|||
* RTX IO |
|||
* Improved [[Nvidia NVDEC|NVDEC]] with [[AV1]] decode |
|||
* NVIDIA [[DLSS]] 2.0 |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
|- |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name=CoreConfig}} |
|||
! rowspan="2" | [[Bus (computing)|Bus]] [[Input/output|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]]<br />([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]] |
|||
! colspan="4" | Processing power (T[[FLOPS]]) |
|||
! colspan="3" | [[Ray tracing (graphics)|Ray-tracing]] Performance |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! rowspan="2" | [[NVLink]] support |
|||
! colspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s) |
|||
! Texture ([[Texel (graphics)|GT]]/s) |
|||
! [[Single-precision floating-point format|Single precision]] |
|||
! [[Double-precision floating-point format|Double precision]] |
|||
! [[Half-precision floating-point format|Half precision]] |
|||
! [[Tensor]] compute (FP16) (2:1 sparse) |
|||
! Rays/s (Billions) |
|||
! RTX OPS/s (Trillions) |
|||
! Ray Perf [[TFLOPS]] |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! rowspan="2" |GeForce RTX 3050<ref>{{cite web |url=https://www.nvidia.com/fi-fi/geforce/graphics-cards/30-series/rtx-3050/ |access-date=2022-01-04 |title=NVIDIA GeForce RTX 3050 Graphics Card Announcement |website=NVIDIA |language=fi}}</ref> |
|||
|December 16, 2022 |
|||
|GA107-150-A1 |
|||
| rowspan="14" |[[Samsung Electronics|Samsung]]<br />[[10 nm process|8LPP]] |
|||
| |
|||
| |
|||
|2560:80:32:80:20 |
|||
|PCIe 4.0 x8 |
|||
| |
|||
|1552 |
|||
|1777 |
|||
|14000 |
|||
|8 |
|||
|224.0 |
|||
| rowspan="6" |[[GDDR6 SDRAM|GDDR6]] |
|||
|128 |
|||
| |
|||
| |
|||
| |
|||
| |
|||
| |
|||
| |
|||
| |
|||
| |
|||
| |
|||
|115 |
|||
| |
|||
|$249 |
|||
| |
|||
|- |
|||
| January 27, 2022<ref>{{cite web |url=https://mugens-reviews.de/builds/pc/neueste-nvidia-grafikkarten/ |access-date=2022-01-06 |title=NVIDIA Announces the GeForce RTX 30 Series |website=Mugens-Reviews |language=de}}</ref> |
|||
| GA106-150-A1 |
|||
| rowspan="3" |13.25 |
|||
| rowspan="3" |276 |
|||
| 2560:80:32:80:20<br>(20) (2) |
|||
| PCIe 4.0<br>x8 |
|||
| rowspan="2" | 2 |
|||
| 1552 |
|||
| 1777 |
|||
| 14000 |
|||
| rowspan="2" | 8 |
|||
| rowspan="2" | 224.0 |
|||
| rowspan="2" | 128 |
|||
| 49.6<br>59.86 |
|||
| 124.2<br>142.2 |
|||
| 7.95<br>9.10
|||
| 0.124<br>0.142 |
|||
| 7.95<br>9.10
|||
| 63.6<br>72.8 |
|||
| |
|||
| |
|||
| 18.2 |
|||
| 130 |
|||
| rowspan="11" {{No}} |
|||
| $249 |
|||
| rowspan="4" {{N/a}} |
|||
|- |
|||
! style="text-align:left;" rowspan="3" |GeForce RTX 3060<ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3060/ |access-date=2021-01-12 |title=NVIDIA GeForce RTX 3060 Graphics Card Announcement |website=NVIDIA}}</ref> |
|||
|October 27, 2022<ref>{{Cite web |title=NVIDIA officially introduces GeForce RTX 3060 8GB and RTX 3060 Ti GDDR6X |url=https://videocardz.com/newz/nvidia-officially-introduces-geforce-rtx-3060-8gb-and-rtx-3060-ti-gddr6x |access-date=2022-10-29 |website=VideoCardz.com |language=en-US}}</ref> |
|||
|GA106-302 |
|||
| rowspan="3" |3584:112:48:112:28<br>(28) (3) |
|||
| rowspan="12" |PCIe 4.0<br>x16 |
|||
| rowspan="3" |1320 |
|||
| rowspan="3" |1777 |
|||
| rowspan="3" |15000 |
|||
| rowspan="3" |63.4<br>85.3 |
|||
| rowspan="3" |147.8<br>199.0 |
|||
| rowspan="3" |9.46<br>12.74 |
|||
| rowspan="3" |0.148<br>0.199 |
|||
| rowspan="3" |9.46<br>12.74 |
|||
| rowspan="3" |75.7<br>101.9 |
|||
| rowspan="3" | |
|||
| rowspan="3" | |
|||
| rowspan="3" |25 |
|||
| rowspan="3" |170 |
|||
| rowspan="3" |$329 |
|||
|-
|||
| February 25, 2021 |
|||
|GA106-300-A1 |
|||
| rowspan="2" |3 |
|||
| rowspan="2" |12 |
|||
| rowspan="2" |360.0 |
|||
| rowspan="2" |192 |
|||
|- |
|||
| September 1, 2021 |
|||
| GA104-150-A1<ref>{{cite web|last=Mujtaba|first=Hassan|date=2021-09-25|title=Custom GALAX & Gainward GeForce RTX 3060 Cards With NVIDIA Ampere GA104 GPUs Listed|url=https://wccftech.com/custom-galax-gainward-geforce-rtx-3060-cards-with-nvidia-ampere-ga104-gpus-listed/|access-date=2021-09-25|website=Wccftech|language=en-US}}</ref> |
|||
| rowspan="5" |17.4 |
|||
| rowspan="5" |392.5 |
|||
|-
|||
! rowspan="2" style="text-align:left;" |GeForce RTX 3060 Ti<ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3060-ti/ |access-date=2020-12-01 |title=NVIDIA GeForce RTX 3060 Ti Graphics Card}}</ref> |
|||
| December 2, 2020 |
|||
| GA104-200-A1 |
|||
| rowspan="2" | 4864:152:80:152:38<br>(38) (5) |
|||
| rowspan="4" |4 |
|||
| rowspan="2" | 1410 |
|||
| rowspan="2" | 1665 |
|||
|14000 |
|||
| rowspan="4" |8 |
|||
|448.0 |
|||
| rowspan="4" |256 |
|||
| rowspan="2" | 112.8<br>133.2 |
|||
| rowspan="2" | 214.3<br>253.1 |
|||
| rowspan="2" | 13.72<br>16.20 |
|||
| rowspan="2" | 0.214<br>0.253 |
|||
| rowspan="2" | 13.72<br>16.20
|||
| rowspan="2" | 109.7<br>129.6 |
|||
| |
|||
| |
|||
| rowspan="2" | 32.4 |
|||
| rowspan="3" | 200 |
|||
| colspan="2" rowspan="2" | $399 |
|||
|- |
|||
|October 27, 2022<ref>{{Cite web |title=NVIDIA officially introduces GeForce RTX 3060 8GB and RTX 3060 Ti GDDR6X |url=https://videocardz.com/newz/nvidia-officially-introduces-geforce-rtx-3060-8gb-and-rtx-3060-ti-gddr6x |access-date=2022-10-29 |website=VideoCardz.com |language=en-US}}</ref> |
|||
|GA104-202 |
|||
|19000 |
|||
|608.0 |
|||
|[[GDDR6 SDRAM#GDDR6X|GDDR6X]] |
|||
| |
|||
| |
|||
|-
|||
! style="text-align:left;" |GeForce RTX 3070<ref>{{cite web |url=https://www.nvidia.com/en-gb/geforce/graphics-cards/30-series/rtx-3070/ |access-date=2020-09-06 |title=NVIDIA GeForce RTX 3070 Graphics Card |website=NVIDIA}}</ref><ref name=":4">{{cite web|last=Smith|first=Ryan|date=September 1, 2020|title=NVIDIA Announces the GeForce RTX 30 Series: Ampere For Gaming, Starting With RTX 3080 & RTX 3090|url=https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090|access-date=2020-09-02|website=AnandTech}}</ref> |
|||
| October 29, 2020<ref>{{cite web|url=https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3070-available-october-29/|title = GeForce RTX 3070 Availability Update|website=NVIDIA}}</ref> |
|||
| GA104-300-A1 |
|||
| 5888:184:96:184:46<br>(46) (6) |
|||
| 1500 |
|||
| 1725 |
|||
|14000 |
|||
|448.0 |
|||
|[[GDDR6 SDRAM|GDDR6]] |
|||
| 144.0<br>165.6 |
|||
| 276.0<br>317.4 |
|||
| 17.66<br>20.31 |
|||
| 0.276<br>0.318 |
|||
| 17.66<br>20.31
|||
| 141.31<br>162.51
|||
| |
|||
| |
|||
| 40.6 |
|||
| colspan=2 | $499 |
|||
|-
|||
! style="text-align:left;" |GeForce RTX 3070 Ti<ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3070-3070ti/ |access-date=2021-06-02 |title=NVIDIA GeForce RTX 3070 Family|website=NVIDIA}}</ref> |
|||
| June 10, 2021 |
|||
| GA104-400-A1 |
|||
| 6144:192:96:192:48<br>(48) (6) |
|||
| 1575 |
|||
| 1770 |
|||
| rowspan="4" |19000 |
|||
| 608.3 |
|||
| rowspan="6" |[[GDDR6 SDRAM#GDDR6X|GDDR6X]] |
|||
| 151.18<br>169.9 |
|||
| 302.36<br>339.8 |
|||
| 19.35<br>21.75 |
|||
| 0.302<br>0.340 |
|||
| 19.35<br>21.75 |
|||
| 154.8<br>174.0 |
|||
| |
|||
| |
|||
| 43.5 |
|||
| 290 |
|||
| colspan=2 | $599 |
|||
|-
|||
! rowspan="2" style="text-align:left;" |GeForce RTX 3080<ref>{{cite web |url=https://www.nvidia.com/en-gb/geforce/graphics-cards/30-series/rtx-3080/ |access-date=2020-09-06 |title=NVIDIA GeForce RTX 3080 Graphics Card|website=NVIDIA}}</ref><ref name=":4">{{cite web|last=Smith|first=Ryan|date=September 1, 2020|title=NVIDIA Announces the GeForce RTX 30 Series: Ampere For Gaming, Starting With RTX 3080 & RTX 3090|url=https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090|access-date=2020-09-02|website=AnandTech}}</ref> |
|||
| September 17, 2020 |
|||
| GA102-200-A1 |
|||
| rowspan="5" |28.3 |
|||
| rowspan="5" |628.4 |
|||
| 8704:272:96:272:68<br>(68) (6) |
|||
| 5 |
|||
| 1440 |
|||
| 1710 |
|||
| 10 |
|||
| 760.0 |
|||
| 320 |
|||
| 138.2<br>164.2 |
|||
| 391.68<br>465.12 |
|||
| 25.07<br>29.77 |
|||
| 0.392<br>0.465 |
|||
| 25.07<br>29.77
|||
| 200.54<br>238.14 |
|||
| |
|||
| |
|||
| 59.5 |
|||
| 320 |
|||
| colspan=2 | $699 |
|||
|- |
|||
| January 27, 2022 |
|||
| GA102-220-A1 |
|||
| 8960:280:96:280:70<br>(70) (6) |
|||
| rowspan="4" |6 |
|||
| 1260 |
|||
| 1710 |
|||
| rowspan="2" |12 |
|||
| rowspan="2" |912.0 |
|||
| rowspan="4" |384 |
|||
| 131.0<br>177.8 |
|||
| 352.8<br>478.8 |
|||
| 22.6<br>30.6 |
|||
| 0.353<br>0.479 |
|||
| 22.6<br>30.6 |
|||
| 180.6<br>245.1 |
|||
| |
|||
| |
|||
| 61.3 |
|||
| rowspan="3" | 350 |
|||
| colspan=2 | $799 |
|||
|-
|||
! style="text-align:left;" |GeForce RTX 3080 Ti<ref>{{cite web |url=https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3080-3080ti/ |access-date=2021-06-02 |title=NVIDIA GeForce RTX 3080 Family of Graphics Card |website=NVIDIA}}</ref> |
|||
| June 3, 2021 |
|||
| GA102-225-A1 |
|||
| 10240:320:112:320:80<br>(80) (7) |
|||
| 1365 |
|||
| 1665 |
|||
| 153.5<br>186.5 |
|||
| 438.5<br>532.8 |
|||
| 28.57<br>34.71 |
|||
| 0.438<br>0.533 |
|||
| 28.06<br>34.10 |
|||
| 228.6<br>272.8 |
|||
| |
|||
| |
|||
| 68.2 |
|||
| colspan=2 | $1199 |
|||
|-
|||
! style="text-align:left;" |GeForce RTX 3090<ref>{{cite web |url=https://www.nvidia.com/en-gb/geforce/graphics-cards/30-series/rtx-3090/ |access-date=2020-09-06 |title=NVIDIA GeForce RTX 3090 Graphics Card |website=NVIDIA}}</ref><ref name=":4">{{cite web|last=Smith|first=Ryan|date=September 1, 2020|title=NVIDIA Announces the GeForce RTX 30 Series: Ampere For Gaming, Starting With RTX 3080 & RTX 3090|url=https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090|access-date=2020-09-02|website=AnandTech}}</ref> |
|||
| September 24, 2020 |
|||
| GA102-300-A1 |
|||
| 10496:328:112:328:82<br>(82) (7) |
|||
| 1395 |
|||
| 1695 |
|||
| 19500 |
|||
| rowspan="2" |24 |
|||
| 935.8 |
|||
| 156.2<br>189.8 |
|||
| 457.6<br>555.96 |
|||
| 29.28<br>35.58 |
|||
| 0.459<br>0.558 |
|||
| 29.28<br>35.58
|||
| 235.08<br>285.48 |
|||
| |
|||
| |
|||
| 71.1 |
|||
| rowspan="2" |2-way [[NVLink]] |
|||
| colspan=2 | $1499 |
|||
|-
|||
! style="text-align:left;" |GeForce RTX 3090 Ti<ref>{{cite web |title=GeForce RTX 3090 Ti Is Here: The Fastest GeForce GPU For The Most Demanding Creators & Gamers |url=https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-ti-out-now/ |access-date=2022-04-02 |website=NVIDIA |language=en-us}}</ref><ref>{{cite web |title=NVIDIA GeForce RTX 3090 Ti Specs |url=https://www.techpowerup.com/gpu-specs/geforce-rtx-3090-ti.c3829 |access-date=2022-04-02 |website=TechPowerUp |language=en}}</ref> |
|||
| March 29, 2022 |
|||
| GA102-350-A1 |
|||
| 10752:336:112:336:84<br>(84) (7) |
|||
| 1560 |
|||
| 1860 |
|||
| 21000 |
|||
| 1008.3 |
|||
| 174.7<br>208.3 |
|||
| 524.2<br>625 |
|||
| 33.5<br>40 |
|||
| 0.524<br>0.625 |
|||
| 33.5<br>40 |
|||
| 269.1<br>320.9 |
|||
| |
|||
| |
|||
| 79.9 |
|||
| 450 |
|||
| colspan=2 | $1999 |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name=CoreConfig|Main [[Unified shader model|shader processors]] : [[texture mapping unit]] : [[render output unit]]s : [[tensor core]]s : [[Ray tracing (graphics)|ray-tracing]] cores (streaming multiprocessors) (graphics processing clusters)}} |
|||
}} |
|||
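The bandwidth column in the table above is the memory transfer rate times the bus width in bytes. A minimal sketch (the helper name is illustrative), checked against the RTX 3070 and RTX 3080 rows:

```python
def bandwidth_gbs(mts, bus_width_bits):
    """Memory bandwidth in GB/s: transfer rate (MT/s) times bus width in bytes."""
    return mts * (bus_width_bits // 8) / 1000

print(bandwidth_gbs(14000, 256))  # 448.0 GB/s, the RTX 3070 row
print(bandwidth_gbs(19000, 320))  # 760.0 GB/s, the RTX 3080 row
```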
=== GeForce 40 series === |
|||
{{Further|GeForce 40 series}} |
|||
*Supported [[Application programming interface|APIs]]: [[Direct3D]] 12 Ultimate (12_2), [[OpenGL]] 4.6, [[OpenCL]] 3.0, [[Vulkan (API)|Vulkan]] 1.3 and [[CUDA]] 8.9<ref>{{Cite web |title=CUDA C++ Programming Guide |url=https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html |access-date=2022-09-20 |website=docs.nvidia.com |language=en-us}}</ref> |
|||
*Supported display connections: [[HDMI]] 2.1, [[DisplayPort]] 1.4a |
|||
*[[Tensor Core|Tensor core]] 4th gen |
|||
*[[RT core]] 3rd gen |
|||
*NVIDIA [[DLSS|DLSS 3]] |
|||
*No NVLink support, Multi-GPU over PCIe 5.0<ref>{{Cite web |last=published |first=Chuong Nguyen |date=2022-09-21 |title=Nvidia kills off NVLink on RTX 4090 |url=https://www.windowscentral.com/hardware/computers-desktops/nvidia-kills-off-nvlink-on-rtx-4090 |access-date=2023-01-01 |website=Windows Central |language=en}}</ref> |
|||
{| class="wikitable" style="font-size: 80%; text-align: center;" |
|||
! rowspan="2" | Model |
|||
! rowspan="2" | Launch |
|||
! rowspan="2" | [[Code name]] |
|||
! rowspan="2" | Process |
|||
! rowspan="2" | Transistors (billion) |
|||
! rowspan="2" | Die size (mm<sup>2</sup>) |
|||
! rowspan="2" | Core config{{efn|name="CoreConfig"}} |
|||
! rowspan="2" | [[Bus (computing)|Bus]] [[Input/output|interface]] |
|||
! rowspan="2" | [[GPU cache|L2 Cache]] ([[Megabyte|MB]]) |
|||
! colspan="3" | Clock speeds |
|||
! colspan="4" | Memory |
|||
! colspan="2" | [[Fillrate]] |
|||
! colspan="4" | Processing power ([[FLOPS|TFLOPS]]) |
|||
! colspan="3" | [[Ray tracing (graphics)|Ray-tracing]] Performance |
|||
! rowspan="2" | [[Thermal design power|TDP]] (Watts) |
|||
! colspan="2" | Release price (USD) |
|||
|- |
|||
! Base core clock ([[Hertz|MHz]]) |
|||
! Boost core clock ([[Hertz|MHz]]) |
|||
! Memory ([[Transfer (computing)|MT/s]]) |
|||
! Size ([[Gigabyte|GB]]) |
|||
! Bandwidth ([[Gigabyte|GB]]/s) |
|||
! Bus type |
|||
! Bus width ([[bit]]) |
|||
! Pixel ([[Pixel|GP]]/s) |
|||
! Texture ([[Texel (graphics)|GT]]/s) |
|||
! [[Single-precision floating-point format|Single precision]] |
|||
! [[Double-precision floating-point format|Double precision]] |
|||
! [[Half-precision floating-point format|Half precision]] |
|||
! [[Tensor]] compute (FP16) (2:1 sparse) |
|||
! Rays/s (Billions) |
|||
! RTX OPS/s (Trillions) |
|||
! Ray Perf [[FLOPS|TFLOPS]] |
|||
! MSRP |
|||
! Founders Edition |
|||
|- |
|||
! style="text-align:left;" | GeForce RTX 4070 Ti<ref>{{cite news |title=NVIDIA ‘accidentally’ confirms GeForce RTX 4070 Ti GPU specifications |url=https://videocardz.com/newz/nvidia-accidentally-confirms-geforce-rtx-4070-ti-gpu-specifications |access-date=2022-12-31 |agency=VideoCardz |date=2022-12-30}}</ref> |
|||
| January 5, 2023 |
|||
| rowspan="2" |AD104-400 |
|||
| rowspan="4" |[[TSMC]] [[5 nm process|N4]] |
|||
| rowspan="2" |35.8 |
|||
| rowspan="2" |294.5 |
|||
| rowspan="2" |7680:240:80:240:60<br />(60)(5) |
|||
| rowspan="4" |PCIe 4.0 x16 |
|||
| rowspan="2" |48 |
|||
| rowspan="2" |2310 |
|||
| rowspan="2" |2610 |
|||
| rowspan="2" |21000 |
|||
| rowspan="2" |12 |
|||
| rowspan="2" |504 |
|||
| rowspan="4" |[[GDDR6 SDRAM#GDDR6X|GDDR6X]] |
|||
| rowspan="2" |192 |
|||
| rowspan="2" |208.8 |
|||
| rowspan="2" |626.4 |
|||
| rowspan="2" |35.5<br />40.1 |
|||
| rowspan="2" |0.554<br />0.627 |
|||
| rowspan="2" |35.5<br />40.1 |
|||
| rowspan="2" |142 (284)<br />160 (321) |
|||
| rowspan="2" | |
|||
| rowspan="2" | |
|||
| rowspan="2" |92.7 |
|||
| rowspan="2" |285 |
|||
| $799 |
|||
| rowspan="2" {{N/A}} |
|||
|- |
|||
! rowspan="2" style="text-align:left;" |GeForce RTX 4080<ref>{{Cite web |title=NVIDIA GeForce RTX 4080 Graphics Cards |url=https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4080/ |access-date=2022-09-20 |website=NVIDIA |language=en-us}}</ref> |
|||
|{{TBA|Unlaunched}} <ref>{{Cite web |last=Warren |first=Tom |date=2022-10-14 |title=Nvidia says it's "unlaunching" the 12GB RTX 4080 after backlash |url=https://www.theverge.com/2022/10/14/23404595/nvidia-rtx-408-12gb-unlaunch |access-date=2022-10-14 |website=The Verge |language=en-US}}</ref><ref>{{Cite web |title=Unlaunching The 12GB 4080 |url=https://www.nvidia.com/en-us/geforce/news/12gb-4080-unlaunch/ |access-date=2022-10-14 |website=NVIDIA |language=en-us}}</ref> |
|||
| $899 |
|||
|- |
|||
| November 16, 2022 |
|||
| AD103-300 |
|||
| 45.9 |
|||
| 378.6 |
|||
| 9728:304:112:304:76<br />(76)(7) |
|||
| 64 |
|||
| 2210 |
|||
| 2505 |
|||
| 22400 |
|||
| 16 |
|||
| 717 |
|||
| 256 |
|||
| 280.6 |
|||
| 761.5 |
|||
| 43.0<br />48.8 |
|||
| 0.672<br />0.761 |
|||
| 43.0<br />48.8 |
|||
| 172 (344)<br />195 (390) |
|||
| |
|||
| |
|||
| 112.7 |
|||
| 320 |
|||
| colspan="2" | $1199 |
|||
|- |
|||
! style="text-align:left;" |GeForce RTX 4090<ref>{{Cite web |title=NVIDIA GeForce RTX 4090 Graphics Cards |url=https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/ |access-date=2022-09-20 |website=NVIDIA |language=en-us}}</ref><ref>{{Cite web |title=NVIDIA Ada GPU Architecture|url=https://images.nvidia.com/aem-dam/Solutions/geforce/ada/nvidia-ada-gpu-architecture.pdf|access-date=October 8, 2022}}</ref> |
|||
| October 12, 2022 |
|||
| AD102-300 |
|||
| 76.3 |
|||
| 608.5 |
|||
| 16384:512:176:512:128<br />(128)(11) |
|||
| 72 |
|||
| 2230 |
|||
| 2520 |
|||
| 21000 |
|||
| 24 |
|||
| 1008 |
|||
| 384 |
|||
| 443.5 |
|||
| 1290.2 |
|||
| 73.1<br />82.6 |
|||
| 1.142<br />1.291 |
|||
| 73.1<br />82.6 |
|||
| 292 (585)<br />330 (661) |
|||
| |
|||
| |
|||
| 191 |
|||
| 450 |
|||
| colspan="2" | $1599 |
|||
|} |
|||
{{notelist|refs= |
|||
{{efn|name="CoreConfig"|Main [[Unified shader model|shader processors]] : [[texture mapping unit]] : [[render output unit]]s : [[tensor core]]s : [[Ray tracing (graphics)|ray-tracing]] cores (streaming multiprocessors) (graphics processing clusters)}} |
|||
}} |
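The tensor-compute column in the GeForce 40 series table lists dense FP16 throughput with the 2:1-sparse figure in parentheses; as the table's values imply, the dense rate is four times the FP32 rate, and sparsity doubles it. A sketch (helper names are illustrative) checked against the RTX 4090 row:

```python
def fp32_tflops(shaders, clock_mhz):
    """FP32 TFLOPS: one FMA (2 ops) per shader per clock."""
    return 2 * shaders * clock_mhz / 1e6

def tensor_fp16_tflops(shaders, clock_mhz):
    """Dense and 2:1-sparse FP16 tensor TFLOPS (4x / 8x the FP32 rate)."""
    dense = 4 * fp32_tflops(shaders, clock_mhz)
    return dense, 2 * dense

# RTX 4090: 16384 shaders, base 2230 MHz / boost 2520 MHz
print(round(fp32_tflops(16384, 2230), 1))  # 73.1 TFLOPS, the table's base figure
dense, sparse = tensor_fp16_tflops(16384, 2520)
print(round(dense), round(sparse))         # 330 661, matching "330 (661)"
```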
Latest revision as of 02:25, 21 February 2023