Pipes Feed Preview: TechSpot Full Feed

  1. 50 Years Later: The Revolutionary 8008 Microprocessor

    2024-03-28 05:21:00 UTC

    <p class="byline">Guest author Ken Shirriff is a Silicon Valley-based computer enthusiast who enjoys reverse-engineering old chips and restoring classic equipment such as the Xerox Alto. Shirriff wrote the Arduino IRremote library for infrared remotes, attempted Bitcoin mining on a 60 year old IBM 1401 punch card mainframe, and got six symbols added to Unicode including the Bitcoin symbol. You can follow his work at <a href="https://www.righto.com/">Righto.com</a> and <a href="https://twitter.com/kenshirriff">@kenshirriff</a>.</p> <p>Intel's groundbreaking 8008 microprocessor was first produced over 50 years ago. This was Intel's first 8-bit microprocessor and the ancestor of the x86 processor family that you may be using right now. I couldn't find good die photos of the 8008, so I opened one up and took some detailed photographs. These new die photos are in this article, along with a discussion of the 8008's internal design.</p> <p>The photo below shows the tiny silicon die inside the 8008 package (click for higher resolution photo). You can barely see the wires and transistors that make up the chip. The squares around the outside are the 18 pads that are connected to the external pins by tiny bond wires.</p> <p style="text-align: center;"><a target="_blank" href="/photos/article/1397-intel-8008-microprocessor/#8008-die-adjusted"><picture style="padding-bottom: calc(100% * 1274 / 1783)"><source type="image/webp" srcset="https://www.techspot.com/articles-info/1397/images/v-j_500.webp 500w, https://www.techspot.com/articles-info/1397/images/v-j_1100.webp 1100w, https://www.techspot.com/articles-info/1397/images/v-j.webp 1783w" data-sizes="(max-width: 960px) 100vw, 680px"></source><img border="0" height="1274" old-src="data:image/svg+xml,%3Csvg%20xmlns='http://www.w3.org/2000/svg'%20viewBox='0%200%2016%209'%3E%3C/svg%3E" width="1783" alt class="b-lazy" src="https://www.techspot.com/articles-info/1397/images/v.jpg" srcset="https://www.techspot.com/articles-info/1397/images/v-j_500.webp 500w, https://www.techspot.com/articles-info/1397/images/v-j_1100.webp 1100w, https://www.techspot.com/articles-info/1397/images/v-j.webp 1783w" sizes="(max-width: 960px) 100vw, 680px"></picture></a></p> <p class="byline" style="text-align: center;">Die photograph of the 8008 microprocessor</p> <p>You can see the text "8008" on the right edge of the chip and "© Intel 1971" on the lower edge. The initials HF appear on the top right for Hal Feeney, who did the chip's logic design and physical layout. Other key designers of the 8008 were Ted Hoff, Stan Mazor, and Federico Faggin.</p> <h3 id="Inside_the_chip">Inside the chip</h3> <p>The diagram below highlights some of the major functional blocks of the chip. On the left is the 8-bit Arithmetic/Logic Unit (ALU), which performs the actual data computations.<sup>3</sup></p> <p>The ALU uses two temporary registers to hold its input values. 
These registers take up significant area on the chip, not because they are complex, but because they need large transistors to drive signals through the ALU circuitry.</p>

<p class="byline">Die of the 8008 microprocessor showing major components.</p>

<p>Below the registers is the carry-lookahead circuitry. For addition and subtraction, this circuit computes all eight carry values in parallel to improve performance.<sup>2</sup> Since the low-order carry depends on just the low-order bits, while the higher-order carries depend on multiple bits, the circuit block has a triangular shape.</p>

<p>The triangular layout of the ALU is unusual. Most processors stack the circuitry for each bit into a regular rectangle (a bit-slice layout). The 8008, however, has eight blocks (one for each bit) arranged haphazardly to fit around the space left by the triangular carry generator. The ALU supports eight simple operations.<sup>3</sup></p>
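<p>To make the parallel-carry idea concrete, here is a minimal Python sketch of the lookahead computation, using the standard generate/propagate formulation; the names and loop structure are my own illustration, not a transcription of the 8008's actual circuit:</p>

```python
def lookahead_carries(a, b, width=8):
    """Compute every carry of a+b directly from generate/propagate
    terms, instead of rippling each carry into the next stage."""
    g = [(a >> i) & (b >> i) & 1 for i in range(width)]    # generate
    p = [((a >> i) | (b >> i)) & 1 for i in range(width)]  # propagate
    carries = []
    for i in range(width):
        # c[i+1] = g[i] | p[i]g[i-1] | p[i]p[i-1]g[i-2] | ...
        c = 0
        for j in range(i, -1, -1):
            term = g[j]
            for k in range(j + 1, i + 1):
                term &= p[k]
            c |= term
        carries.append(c)
    return carries

# Carry i+1 needs i+1 product terms: the low-order carry is a single
# gate while the high-order carry needs terms from all eight bits,
# which is why the circuit block on the die is triangular.
assert lookahead_carries(0xFF, 0x01) == [1] * 8
```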
<p>In the center of the chip is the instruction register and the instruction decoding logic that determines the meaning of each 8-bit machine instruction. Decoding is done with a <a href="https://en.wikipedia.org/wiki/Programmable_logic_array">Programmable Logic Array</a> (PLA), an arrangement of gates that matches bit patterns and generates the appropriate control signals for the rest of the chip. On the right are the storage blocks. The 8008's seven registers are in the upper right. In the lower right is the address stack, which consists of eight 14-bit address words. Unlike most processors, the 8008's call stack is stored on the chip instead of in memory. The program counter is just one of these addresses, making subroutine calls and returns very simple. The 8008 uses dynamic memory for this storage.</p>
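<p>A toy model helps show why this scheme makes calls and returns so simple. The class below is my own sketch (the names are illustrative, not Intel's) of an eight-entry, 14-bit address stack in which the program counter is simply whichever slot a 3-bit pointer currently selects:</p>

```python
class AddressStack:
    """Sketch of the 8008's on-chip address stack: eight 14-bit slots,
    one of which serves as the program counter at any moment."""
    MASK = 0x3FFF                 # addresses are 14 bits

    def __init__(self):
        self.slots = [0] * 8
        self.ptr = 0              # 3-bit pointer selecting the PC slot

    @property
    def pc(self):
        return self.slots[self.ptr]

    def call(self, target):
        # The return address simply stays behind in the old slot;
        # only the pointer moves. Nesting deeper than 8 levels would
        # silently overwrite the oldest return address.
        self.ptr = (self.ptr + 1) % 8
        self.slots[self.ptr] = target & self.MASK

    def ret(self):
        # A return is just a pointer decrement: no memory traffic.
        self.ptr = (self.ptr - 1) % 8

stack = AddressStack()
stack.call(0x0100)
stack.call(0x0200)
stack.ret()
assert stack.pc == 0x0100
```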
A few vertical silicon "wires" are visible in the photo.<sup>4</sup></p> <p class="tsadinc">Transistors are the key component of the chip, and a transistor is formed where a polysilicon wire crosses doped silicon. In the photo, the polysilicon appears as a brighter orange where it forms a transistor.</p> <h3 id="Why_an_18_pin">Why an 18 pin chip?</h3> <p class="tsadinc">One inconvenient feature of the 8008 is it only has 18 pins, which makes the chip slower and much more difficult to use. The 8008 uses 14 address bits and 8 data bits so with 18 pins there aren't enough pins for each signal. Instead, the chip has 8 data pins that are reused in three cycles to transmit the low address bits, high address bits, and data bits. A computer using the 8008 requires many support chips to interact with this inconvenient bus architecture.<sup>5</sup></p> <p class="tsadinc">There was no good reason to force the chip into 18 pins. Packages with 40 or 48 pins were common with other manufacturers, but 16 pins was "a religion at Intel".<sup>6</sup> Only with great reluctance did they move to 18 pins. By the time the 8080 processor came out a few years later, Intel had come to terms with 40-pin chips. The 8080 was much more popular, in part because it had a simpler bus design permitted by the 40-pin package.</p> <h3 id="Power_and_data_paths">Power and data paths in the chip</h3> <p>The data bus provides data flow through the chip. The diagram below shows the 8-bit data bus of the 8008 with rainbow colors for the 8 data lines. The data bus connects to the 8 data pins along the outside of the upper half of the chip. The bus runs between the ALU on the left, the instruction register (upper center), and the registers and stack on the right. The bus is split on the left with half along each side of the ALU.</p> <p><a href="https://www.techspot.com/articles-info/1397/images/die.webp" target="_blank"><picture style="padding-bottom: calc(100% * 1430 / 2000)"><source type="image/webp" srcset="https://www.techspot.com/articles-info/1397/images/die.webp 2000w" data-sizes="(max-width: 960px) 100vw, 680px"></source><img alt height="1430" old-src="data:image/svg+xml,%3Csvg%20xmlns='http://www.w3.org/2000/svg'%20viewBox='0%200%2016%209'%3E%3C/svg%3E" width="2000" class="b-lazy" src="https://www.techspot.com/articles-info/1397/images/d." srcset="https://www.techspot.com/articles-info/1397/images/die.webp 2000w" sizes="(max-width: 960px) 100vw, 680px"></picture></a></p> <p class="byline tsadinc" style="text-align: center;">Die photo of the 8008 microprocessor. The power bus is shown in red and blue. The data bus is shown with 8 rainbow colors.</p> <p class="tsadinc">The red and blue lines show power routing. Power routing is an under-appreciated aspect of microprocessors. Power is routed in the metal layer due to its low resistance. But since there is only one metal layer in early microprocessors, power distribution must be carefully planned so the paths don't cross.<sup>7</sup> The diagram above shows Vcc lines in blue and Vdd lines in red. Power is supplied through the Vcc pin on the left and the Vdd pin on the right, then branches out into thin, interlocking wires that supply all parts of the chip.</p> <h3 id="The_register_file">The register file</h3> <p>To show what the chip looks like in detail, I've zoomed in on the 8008's register file in the photo below. 
<h3>Power and data paths in the chip</h3>

<p>The data bus provides data flow through the chip. The diagram below shows the 8-bit data bus of the 8008 with rainbow colors for the 8 data lines. The data bus connects to the 8 data pins along the outside of the upper half of the chip. The bus runs between the ALU on the left, the instruction register (upper center), and the registers and stack on the right. The bus is split on the left, with half along each side of the ALU.</p>

<p class="byline">Die photo of the 8008 microprocessor. The power bus is shown in red and blue. The data bus is shown with 8 rainbow colors.</p>

<p>The red and blue lines show power routing. Power routing is an under-appreciated aspect of microprocessors. Power is routed in the metal layer due to its low resistance. But since there is only one metal layer in early microprocessors, power distribution must be carefully planned so the paths don't cross.<sup>7</sup> The diagram above shows Vcc lines in blue and Vdd lines in red. Power is supplied through the Vcc pin on the left and the Vdd pin on the right, then branches out into thin, interlocking wires that supply all parts of the chip.</p>

<h3>The register file</h3>

<p>To show what the chip looks like in detail, I've zoomed in on the 8008's register file in the photo below. The register file consists of an 8 by 7 grid of dynamic RAM (DRAM) storage cells, each using three transistors to hold one bit.<sup>8</sup> (You can see the transistors as the small rectangles where the orange polysilicon takes on a slightly more vivid color.) Each row is one of the 8008's seven 8-bit registers (A, B, C, D, E, H, L). On the left, you can see seven pairs of horizontal wires: the read select and write select lines for each register. At the top, you can see eight vertical wires to read or write the contents of each bit, along with 5 thicker wires to supply Vcc. Using DRAM for registers (rather than the more common static latches) is an interesting choice. Since Intel was primarily a memory company at the time, I expect they chose DRAM due to their expertise in the area.</p>

<p class="byline">The register file in the 8008. The chip has seven 8-bit registers: A, B, C, D, E, H, L.</p>

<h3>How PMOS works</h3>

<p>The 8008 uses PMOS transistors. To simplify slightly, you can think of a PMOS transistor as a switch between two silicon wires, controlled by a gate input (of polysilicon). The switch closes when its gate input is low, and it can pull its output high. If you're familiar with the NMOS transistors used in microprocessors like the 6502, PMOS may be a bit confusing because everything is backwards.</p>

<p>A simple PMOS NAND gate can be constructed as shown below. When both inputs are high, the transistors are off and the resistor pulls the output low. When either input is low, that transistor conducts, connecting the output to +5. Thus, the circuit implements a NAND gate. For compatibility with 5-volt TTL circuits, the PMOS gate (and thus the 8008) is powered with unusual voltages: -9V and +5V.</p>
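<p>Here is a truth-table sketch of the switching behavior just described, with the transistor-level electrical details abstracted into boolean logic (a model only, not a circuit simulation):</p>

```python
def pmos_nand(a: int, b: int) -> int:
    """Model the PMOS NAND gate: each transistor conducts when its
    gate is LOW, pulling the output up to +5V (logic 1); with both
    gates high, both transistors are off and the load resistor pulls
    the output down toward -9V (logic 0)."""
    a_conducts = (a == 0)      # PMOS switch closes on a low gate
    b_conducts = (b == 0)
    return 1 if (a_conducts or b_conducts) else 0

# The full truth table is exactly NAND:
assert [pmos_nand(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))] == [1, 1, 1, 0]
```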
<p class="byline">A NAND gate implemented with PMOS logic.</p>

<p>For technical reasons, the resistor is actually implemented with a transistor. The diagram below shows how the transistor is wired to act as a pull-down resistor. The detail on the right shows how this circuit appears on the chip. The -9V metal wire is at the top, the transistor is in the middle, and the output is the silicon wire at the bottom.</p>

<p class="byline">In PMOS, a pull-down resistor (left) is implemented with a transistor (center). The photo on the right shows an actual pull-down in the 8008 microprocessor.</p>

<h3>History of the 8008</h3>

<p>The 8008's complicated story starts with the <a href="https://en.wikipedia.org/wiki/Datapoint_2200">Datapoint 2200</a>, a popular computer introduced in 1970 as a programmable terminal. (Some people consider the Datapoint 2200 to be the <a href="https://www.amazon.com/gp/product/B00AZDH71M">first personal computer</a>.) Rather than using a microprocessor, the Datapoint 2200 contained a board-sized CPU built from individual TTL chips. (This was the standard way to build a CPU in the minicomputer era.) Datapoint and Intel decided that it would be possible to replace this board with a single MOS chip, and Intel started the 8008 project to build this chip. A bit later, Texas Instruments also agreed to build a single-chip processor for Datapoint.
Both chips were designed to be compatible with the Datapoint 2200's 8-bit instruction set and architecture.</p>

<p class="byline">The 8008 processor was first described publicly in "Electronic Design", Oct 25, 1970. Although Intel claimed the chip would be delivered in January 1971, actual delivery was more than a year later, in April 1972.</p>

<p>Around March 1971, Texas Instruments completed their processor chip, calling it the <a href="https://www.righto.com/2015/05/the-texas-instruments-tmx-1795-first.html">TMC 1795</a>. After delaying the project, Intel finished the 8008 chip later, around the end of 1971. For a variety of reasons, Datapoint rejected both microprocessors and built a faster CPU based on newer TTL chips, including the <a href="https://en.wikipedia.org/wiki/74181">74181 ALU chip</a>.</p>

<p>TI tried unsuccessfully to market the TMC 1795 processor to companies such as Ford, but ended up abandoning the processor, focusing on highly profitable calculator chips instead. Intel, on the other hand, marketed the 8008 as a general-purpose microprocessor, which eventually led to the x86 architecture you're probably using right now. Although TI was first with an 8-bit processor, it was Intel who made their chip a success, creating the microprocessor industry.</p>

<p class="byline">A family tree of the 8008 and some related processors. Black arrows indicate backwards compatibility.
Light arrows indicate significant architecture changes.</p>

<p>The diagram above summarizes the "family tree" of the 8008 and some related processors.<sup>10</sup> The Datapoint 2200's architecture was used in the TMC 1795, the Intel 8008, and the next version of the Datapoint 2200.<sup>11</sup> Thus, four entirely different processors were built using the Datapoint 2200's instruction set and architecture. The Intel 8080 processor was a much-improved version of the 8008. It significantly extended the 8008's instruction set and reordered the machine code instructions for efficiency. The 8080 was used in groundbreaking early microcomputers such as the Altair and the IMSAI. After working on the 4004 and 8080, designers Federico Faggin and Masatoshi Shima left Intel to build the Zilog Z-80 microprocessor, which improved on the 8080 and became very popular.</p>

<p>The jump to the 16-bit 8086 processor was much less evolutionary. Most 8080 assembly code could be converted to run on the 8086, but not trivially, as the instruction set and architecture were radically changed. Nonetheless, some characteristics of the Datapoint 2200 still exist in today's x86 processors. For instance, the Datapoint 2200 had a serial processor, processing bytes one bit at a time. Since the lowest bit needs to be processed first, the Datapoint 2200 was <a href="https://en.wikipedia.org/wiki/Endianness">little-endian</a>. For compatibility, the 8008 was little-endian, and this is still the case in Intel's processors. Another feature of the Datapoint 2200 was the parity flag, since parity calculation was important for a terminal's communication. The parity flag has carried through to the x86 architecture.</p>
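<p>Both inherited quirks are easy to demonstrate on a modern x86 machine; here is a small Python illustration (the values are arbitrary examples):</p>

```python
import struct

# Little-endian byte order: the low-order byte is stored first, a
# convention inherited from the Datapoint 2200's bit-serial,
# lowest-bit-first processing and still used by x86 today.
assert struct.pack("<H", 0x1234) == b"\x34\x12"

def parity_flag(value: int) -> bool:
    """x86 parity flag: set when the low byte contains an even
    number of 1 bits, a descendant of the Datapoint 2200's
    communications parity check."""
    return bin(value & 0xFF).count("1") % 2 == 0

assert parity_flag(0b0000_0011) is True    # two 1 bits: even parity
assert parity_flag(0b0000_0111) is False   # three 1 bits: odd parity
```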
<p>The 8008 is architecturally unrelated to Intel's 4-bit 4004 processor.<a href="https://www.righto.com/2016/12/die-photos-and-analysis-of_24.html#fn:i4004"><sup>12</sup></a> The 8008 is not an 8-bit version of the 4-bit 4004 in any way. The similar names are purely a marketing invention; during its design phase the 8008 had the unexciting name "1201".</p>

<p>If you want more early microprocessor history, I wrote a detailed article for <a href="https://spectrum.ieee.org/computing/hardware/the-surprising-story-of-the-first-microprocessors">IEEE Spectrum</a>. I also wrote a post about TI's <a href="https://www.righto.com/2015/05/the-texas-instruments-tmx-1795-first.html">TMC 1795</a>.</p>

<h3>How the 8008 fits into the history of semiconductor technology</h3>

<p>The 4004 and 8008 both used silicon-gate enhancement-mode PMOS, a semiconductor technology that was only used briefly. This puts the chips at an interesting point in chip fabrication technology.</p>

<p>The 8008 (like modern processors) uses MOS transistors. These transistors had a long path to acceptance, being slower and less reliable than the bipolar transistors used in most computers of the 1960s. By the late 1960s, MOS integrated circuits were becoming more common; the standard technology was PMOS transistors with metal gates. The gates of the transistors were made of metal, which was also used to connect components of the chip. Chips essentially had two layers of functionality: the silicon itself, and the metal wiring on top. This technology was used in many Texas Instruments calculator chips, as well as the TMC 1795 (the chip with the same instruction set as the 8008).</p>

<p>A key innovation that made the 8008 practical was the <a href="https://en.wikipedia.org/wiki/Self-aligned_gate">self-aligned gate</a> – a transistor using a gate of polysilicon rather than metal. Although this technology was invented by Fairchild and Bell Labs, it was Intel that pushed it ahead. Polysilicon-gate transistors had much better performance than metal-gate ones (for complex semiconductor reasons). In addition, adding a polysilicon layer made routing of signals in the chip much easier, making the chips denser. The diagram below shows the benefit of self-aligned gates: the metal-gate TMC 1795 is bigger than the 4004 and 8008 chips combined.</p>

<p class="byline">Intel's 4004 and 8008 processors are much denser than Texas Instruments' TMC 1795 chip, largely due to their use of self-aligned gates.</p>

<p>Shortly afterwards, semiconductor technology improved again with the use of NMOS transistors instead of PMOS transistors. Although PMOS transistors were easier to manufacture initially, NMOS transistors are faster, so once NMOS could be fabricated reliably, they were a clear win.</p>

<p>NMOS led to more powerful chips such as the <a href="https://en.wikipedia.org/wiki/Intel_8080">Intel 8080</a> and the Motorola 6800 (both 1974). Another technology improvement of this time was ion implantation to change the characteristics of transistors. This allowed the creation of "depletion-mode" transistors for use as pull-up resistors. These transistors improved chip performance and reduced power consumption. They also allowed the creation of chips that ran on standard five-volt supplies.<sup>13</sup></p>

<p>The combination of NMOS transistors and depletion-mode pull-ups was used for most of the microprocessors of the late 1970s and early 1980s, such as the 6502 (1975), Z-80 (1976), 68000 (1979), and Intel chips from the 8085 (1976) to the 80286 (1982).</p>

<p>In the mid-1980s, CMOS took over, using NMOS and PMOS transistors together to dramatically reduce power consumption, with chips such as the 80386 (1986), 68020 (1984), and <a href="https://www.righto.com/2015/12/reverse-engineering-arm1-ancestor-of.html">ARM1</a> (1985). Now almost all chips are CMOS.<sup>14</sup></p>

<p>As you can see, the 1970s were a time of large changes in semiconductor chip technology.
The 4004 and 8008 were created when the technological capability intersected with the right market.</p>

<h3>How to take die photos</h3>

<p>In this section, I explain how I got the photos of the 8008 die. The first step is to open the chip package to expose the die. Most chips come in epoxy packages, which can be dissolved with <a href="https://zeptobars.ru/en/read/how-to-open-microchip-asic-what-inside">dangerous acids</a>.</p>

<p class="byline">The 8008 microprocessor in a ceramic package</p>

<p>Since I would rather avoid boiling nitric acid, I took a simpler approach. The 8008 is also available in a ceramic package (above), which I got on eBay. Tapping the chip along the seam with a chisel pops the two ceramic layers apart. The photo below shows the lower half of the ceramic package, with the die exposed. Most of the metal pins have been removed, but their positions in the package are visible. To the right of the die is a small square; this connects ground (Vcc) to the substrate. A couple of the tiny bond wires are still visible, connected to the die.</p>

<p class="byline">Inside the package of the 8008 microprocessor, the silicon die is visible.</p>

<p>Once the die is exposed, a microscope can be used to take photographs. A standard microscope shines the light from below, which doesn't work well for die photographs.
Instead, I used a <a href="https://www.amazon.com/gp/product/B004WMFOY4">metallurgical microscope</a>, which shines the light from above to illuminate the chip.</p>

<p>I took 48 photographs through the microscope and then used the Hugin stitching software to combine them into one high-resolution image (<a href="https://www.righto.com/2015/12/creating-high-resolution-integrated.html">details</a>). Finally, I adjusted the image contrast to make the chip's structures more visible. The original image (which is approximately what you see through the microscope) is below for comparison.</p>

<p class="byline">Die photograph of the 8008 microprocessor</p>

<h3>Conclusion</h3>

<p>I took detailed die photos of the 8008 that reveal the circuitry it used. While the 8008 wasn't the first microprocessor or even the first 8-bit microprocessor, it was truly revolutionary, triggering the microprocessor revolution and leading to the x86 architecture that dominated personal computers for decades to come. In future posts, I plan to explain the 8008's circuits in detail to provide a glimpse into the roots of today's computers.</p>

<p>I announce my latest blog posts on Twitter, so follow me at <a href="https://twitter.com/kenshirriff">kenshirriff</a>. Or you can use the <a href="https://www.righto.com/feeds/posts/default">RSS feed</a>.</p>

<h4>Notes and references</h4>

<p class="byline">1. According to the <a href="https://archive.computerhistory.org/resources/access/text/2012/07/102657982-05-01-acc.pdf">oral history of the 8008</a>, photos of the 8008 were obtained in October / November 1971 (page 6). Chip designer Federico Faggin mentions that toward the end of 1971, "everything was working except for a few errors." Faggin then debugged a problem with the dynamic memory losing data, making it ready for production (page 9).</p>

<p class="byline">2. Using the carry-lookahead circuit avoids the delay of a standard <a href="https://en.wikipedia.org/wiki/Adder_(electronics)#Ripple-carry_adder">ripple-carry adder</a>, where the carries propagate through the sum.</p>

<p class="byline">3.
The 8008's ALU supports eight operations: add, subtract, add with carry, subtract with carry, AND, OR, XOR, and compare. It also implements left and right shift and rotate operations. The 8008 also has increment and decrement instructions, extending the <a href="https://www.textfiles.com/bitsavers/pdf/datapoint/2200/2200_Reference_Manual.pdf">Datapoint 2200's instruction set</a>.</p>

<p class="byline">4. Because silicon has higher resistance than polysilicon, most chips use the polysilicon and metal layers for wiring, not the silicon layer. The 4004 and 8008 chips are unusual in that they prefer to use the silicon layer for wiring rather than polysilicon. I expect this was due to the recent introduction of polysilicon: before polysilicon, routing needed to be done in the silicon layer, and perhaps the chip designers were sticking with the older layout techniques.</p>

<p class="byline">5. The 8008 required 20 support chips, according to <a href="https://www.electronicsweekly.com/blogs/mannerisms/yarns/after-the-4004-the-8008-and-80-2008-08/">chip architect Federico Faggin</a>. In contrast, the 4004 and earlier MOS computers such as the Four-Phase and CADC were designed with a small number of MOS chips that worked together without extra "glue chips". In this sense, the 8008 was a step backwards architecturally, saying "here's the CPU, you figure out how to make a computer out of it."</p>

<p class="byline">6. For details on Intel's insistence on 16 pins, see the <a href="https://archive.computerhistory.org/resources/text/Oral_History/Faggin_Federico/Faggin_Federico_1_2_3.oral_history.2004.102658025.pdf">Oral History of Federico Faggin</a>, pages 55-56. It was only when the <a href="https://en.wikipedia.org/wiki/Intel_1103">1103 memory chip</a> required 18 pins that Intel reluctantly moved beyond 16 pins. Even that was treated at Intel as if "the sky had dropped from heaven," resulting in "so many long faces".</p>

<p class="byline">7. If two metal lines need to cross, one of them can be routed under the other by using the polysilicon layer. To have low resistance, this cross-under must be relatively wide, so cross-unders are avoided where possible.</p>

<p class="byline">8. The 8008 registers use the "3T1C" cell: three transistors and one capacitor (<a href="https://docencia.ac.upc.edu/master/MIRI/NCD/docs/04-Memory%20Structures-2.pdf">details</a>). The circuit doesn't physically contain a separate capacitor, but uses the gate capacitance of the transistor. One unusual feature of the 8008 cell is that it uses one wire for both reading and writing the bit, while the typical 3T cell has separate wires for reading and writing. The 4004 had separate wires, but the design changed slightly in the 8008.</p>

<p class="byline">9. Pull-up resistors in later chips such as the 6502 were implemented using <a href="https://en.wikipedia.org/wiki/Depletion-load_NMOS_logic#Depletion-mode_transistors">depletion-mode NMOS transistors</a>. These yielded faster, more efficient logic. They were also wired differently, with the gate connected to the output rather than the power rail.</p>

<p class="byline">10. The 8008 architecture and the evolution of Intel's microprocessors are discussed in detail in <a href="https://tcm.computerhistory.org/ComputerTimeline/Chap37_intel_CS2.pdf">Intel Microprocessors: 8008 to 8086</a>.</p>

<p class="byline">11. The second version of the Datapoint 2200 had a totally new implementation of the processor, still built from TTL chips.
While the first version had a serial ALU (processing one bit at a time), the second version operated in parallel using 74181 ALU chips. As a result, the second version was much faster.</p>

<p class="byline">12. The extensive <a href="https://www.4004.com/">4004 Anniversary Project</a> has reverse-engineered the 4004 processor. The 4004 schematic is <a href="https://www.intel.com/Assets/PDF/General/4004_schematic.pdf">here</a>.</p>

<p class="byline">13. The Motorola 6800 microprocessor originally used enhancement-mode transistors. To operate off a single +5V supply, it had a <a href="https://www.google.com/patents/US3942047">voltage-doubler</a> circuit on the chip.</p>

<p class="byline">14. Interestingly, in 2007 Intel started using metal gates again in order to scale transistors further (<a href="https://www.eetimes.com/document.asp?doc_id=1281197">details</a>). In a way, semiconductor technology has come full circle, back to metal gates, although now unusual metals such as hafnium are used.</p>
  2. Nvidia GeForce RTX 3050 6GB Review

    2024-03-25 11:08:00 UTC

<p>The GeForce RTX 3050 6GB is an interesting new release from Nvidia targeting the sub-$200 market. The interesting part is not so much its performance as the other factors surrounding it, which we'll detail in this introduction to give proper context, as this GPU needs to be looked at from a few different angles.</p>

<p>First, let's get the dodgy naming out of the way. The 6GB RTX 3050 isn't simply the original <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/">RTX 3050 8GB</a> model with 2GB of VRAM removed. Instead, it's a significantly cut-down product in almost every way. Whereas the original model, released two years ago, is based on the GA106 silicon, the new 6GB variant utilizes the GA107, which is almost 30% smaller.</p>

<p>This reduction results in a 10% decrease in core count, with two fewer SM units, leading to 10% fewer Tensor cores and RT cores. Furthermore, the cores are clocked 17% lower, reducing the boost from 1,777 MHz to just 1,470 MHz. These downgrades have led to a significant TDP reduction, from 130 watts to just 70 watts. This change is important as it means the RTX 3050 6GB doesn't require external power, a point we'll return to shortly.</p>

<p>About that 6GB VRAM buffer: while the 8GB model employs a 128-bit wide memory bus, the 6GB version omits a 2GB memory chip, narrowing the bus to 96-bit and consequently reducing the memory bandwidth by 25% to 168 GB/s.</p>
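<p>A quick back-of-the-envelope check of those cut-down figures (the 14 Gbps GDDR6 data rate is our assumption, chosen to be consistent with the quoted 168 GB/s):</p>

```python
# Memory bandwidth: bus width (bits) / 8 * data rate (Gbps) = GB/s
print(96 / 8 * 14)                     # 168.0 GB/s for the 6GB card
print(128 / 8 * 14)                    # 224.0 GB/s for the 8GB card
print(1 - 168 / 224)                   # 0.25 -> the quoted 25% cut

# Boost clock: 1,777 MHz down to 1,470 MHz
print(round((1 - 1470 / 1777) * 100))  # 17 -> the quoted 17% reduction
```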
<p>In other words, although the 6GB and 8GB versions of the RTX 3050 share the same name, they are fundamentally different products using entirely different silicon. Potential buyers might think they are purchasing the same RTX 3050 with just 2GB of VRAM removed, when in fact they are getting a substantially slower product.</p>

<p>This misleading practice is clearly anti-consumer, and that alone spoils the 6GB RTX 3050 for us. It would have been more appropriate and reasonable to name this product the RTX 3040 or 3040 Ti, avoiding confusion and setting realistic expectations. So we're not giving Nvidia a pass here: the naming is atrocious. But it's not another <a href="https://www.techspot.com/review/1658-geforce-gt-1030-abomination/">GT 1030 DDR4 situation</a>; naming aside, we might actually have a good product on our hands.</p>

<p>One reason is pricing. The <a href="https://www.techspot.com/review/2403-nvidia-geforce-rtx-3050/">RTX 3050 8GB</a> was intended to be a $250 product, yet it rarely sold at that price and currently exceeds $300, severely undermining its value proposition. In contrast, the 6GB model has a suggested retail price of $170 and is presently available for $180, making it 40% less expensive than the 8GB model.</p>

<p>Pricing looks decent, and another positive aspect is power consumption. With a TDP of just 70 watts, the 6GB RTX 3050 doesn't need external PCI Express power; all necessary power can be supplied through the PCIe slot.
Although this might not seem significant to most, since connecting 6 or 8-pin PCIe power is straightforward and the difference between a 70 watt and a 130 watt TDP is negligible for many, it becomes pivotal in specific scenarios.</p>

<p>For instance, those who have picked up a cheap Dell OptiPlex or similar OEM system will find the 6GB 3050 revolutionary. These office-type PCs often come with proprietary power supplies lacking PCIe power connectors, making installation of a more powerful graphics card like the Radeon RX 6600 quite tricky.</p>

<p>Granted, this is a niche use case, and in most instances it won't make sense. Moreover, the tech-savvy among you will be able to get something like a <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/">Radeon RX 6600</a> working in such a system, either by buying an ATX wiring adapter so you can use a normal power supply, or by jury-rigging your own power supply to the graphics card. There are several low-cost and effective ways to do this, sometimes even using a DC power pack you might already have, and there are some great YouTube videos showing how.</p>

<p>While the 6GB RTX 3050 now holds the title of the world's fastest gaming graphics card that doesn't require external power, this distinction might not mean much to most. Nevertheless, we will consider this angle as we review the product.</p>

<p>What will ultimately matter to most is how the 6GB RTX 3050's performance at $180 compares with the slightly pricier Radeon RX 6600. There's also the new 8GB version of the <a href="https://www.techspot.com/review/2789-amd-radeon-6500-xt-revisit/">6500 XT</a> to consider, although despite being significantly better than its 4GB counterpart, it remains an unattractive option compared to the RX 6600.</p>

<p>For testing, we're using our <a href="https://www.techspot.com/review/2783-ryzen-7800x3d-vs-core-i9-14900k/">Ryzen 7 7800X3D</a> test system equipped with 32GB of DDR5-6000 memory and the <a href="https://www.techspot.com/downloads/drivers/essentials/nvidia-geforce/">latest display drivers</a>. All data has been validated and updated for this review, with all GPUs tested at their base specification. In total, we've tested 13 games at 1080p using appropriate quality settings.
Let's take a closer look at what we have…</p>

<h2>Benchmarks</h2>

<p>First up, we have Remnant II, where the 6GB 3050 is just 14% slower than the original version. This is a significant margin for two products that share the same name, but based on the specs, one might have expected a larger gap between the two. It was slightly slower than the 8GB version of the <a href="https://www.techspot.com/review/2789-amd-radeon-6500-xt-revisit/">6500 XT</a>, but also a notable 46% faster than the <a href="https://www.techspot.com/review/2456-amd-radeon-6400/">RX 6400</a>, AMD's fastest GPU that doesn't require external power.</p>

<p>If you're not concerned about external power, then trailing the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/">RX 6600</a> by a 35% margin will likely be a problem for the 6GB RTX 3050.</p>
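<p>For reference, the "X% slower" figures quoted throughout are relative average frame rates; a one-liner makes the convention explicit (the frame rates below are hypothetical, purely to show the arithmetic):</p>

```python
def percent_slower(fps: float, baseline_fps: float) -> int:
    """How much slower a card is than a baseline: 1 - fps/baseline."""
    return round((1 - fps / baseline_fps) * 100)

assert percent_slower(86, 100) == 14       # reads as "14% slower"
assert round((100 / 86 - 1) * 100) == 16   # the same gap, read as "16% faster"
```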
<p>In tests with <a href="https://www.techspot.com/review/2656-the-last-of-us-gpu-benchmark/">The Last of Us Part I</a>, we again find the 6GB RTX 3050 not too far behind the 8GB model, coming in 14% slower. It also delivers twice the performance of the RX 6400, though it is almost 30% slower than the RX 6600.</p>

<p>In <a href="https://www.techspot.com/review/2731-starfield-gpu-benchmark/">Starfield</a>, the 6GB RTX 3050 averaged 36 fps using the lowest quality settings at 1080p, making it 16% slower than the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/">RTX 3050 8GB</a> but almost 40% faster than the RX 6400. Again, however, it was a little over 30% slower than the RX 6600.</p>

<p>Testing with Resident Evil 4, we find the 6GB version of the RTX 3050 to be 23% slower than the 8GB model, matching the performance of the 4GB 6500 XT. This made it 27% faster than the RX 6400. However, it was a significant 54% slower than the RX 6600, as the RDNA2 GPU performs very well in this title; this is one of those instances where RDNA2 GPUs can match or outperform RDNA3 GPUs.</p>

<p>Next, we have Ratchet &amp; Clank: Rift Apart, where the 6GB RTX 3050 rendered 61 fps on average, making it 21% slower than the 8GB model and 41% slower than the RX 6600.
Compared to the RX 6400, however, it was a massive 56% faster, offering a significant upgrade for those needing a GPU without external power.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/SWJS.png" alt="Star Wars Jedi: Survivor benchmark results"></p> <p>Testing with Star Wars Jedi: Survivor, the 6GB model was 22% slower than the original, averaging just 45 fps, but still a substantial 32% faster than AMD's RX 6400. For those seeking the best value, and if external power is not a concern, the RX 6600 offers nearly 40% more performance in this title for roughly the same cost.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Forza.png" alt="Forza Motorsport benchmark results"></p> <p>In Forza Motorsport, performance between the 6GB and 8GB models was fairly similar, with the more affordable 6GB version trailing by 11%. That meant it was 62% faster than the RX 6400, a remarkable improvement, though still 24% slower than the RX 6600.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/CP2077.png" alt="Cyberpunk 2077: Phantom Liberty benchmark results"></p> <p><a href="https://www.techspot.com/review/2743-cyberpunk-phantom-liberty-benchmark/" target="_blank">Cyberpunk 2077: Phantom Liberty</a> saw the 6GB 3050 averaging 58 fps, a significant 24% slower than the 8GB model. Even so, it comfortably outpaced the <a href="https://www.techspot.com/review/2513-intel-arc-a380-vs-amd-radeon-6400/" target="_blank">RX 6400</a>, offering 71% more performance.
Yet it was still 36% slower than the RX 6600, reinforcing the 6600 as the better value option if your system can supply external power.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Hogwarts.png" alt="Hogwarts Legacy benchmark results"></p> <p>In <a href="https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/" target="_blank">Hogwarts Legacy</a>, the 6GB RTX 3050 performed on par with the 4GB 6500 XT, making it 20% slower than the 8GB version but still 31% faster than the RX 6400. However, if you don't specifically need a GPU that runs without external power, the RX 6600 delivers almost 60% more performance for approximately the same price, making it the preferable choice.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Avatar.png" alt="Avatar benchmark results"></p> <p>Testing with Avatar revealed the 6GB model to be 22% slower than the original, a considerable margin. Due to what appears to be ray tracing in use even at the lowest quality settings, the RX 6400 significantly underperformed, making the 6GB RTX 3050 nearly twice as fast. Nonetheless, it was still 33% slower than the RX 6600.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/ACM.png" alt="Assassin's Creed Mirage benchmark results"></p> <p>In Assassin's Creed Mirage, the 6GB RTX 3050 averaged 70 fps, comparable to the <a href="https://www.techspot.com/review/2748-intel-arc-a580/" target="_blank">Arc A580</a> and <a href="https://www.techspot.com/review/1961-radeon-rx-5500-4gb-vs-8gb/" target="_blank">RX 5500 XT</a>.
In this case, it was only 15% faster than the RX 6400, a modest margin relative to the significant advantages seen in almost every other title tested.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/AW2.png" alt="Alan Wake 2 benchmark results"></p> <p>With Alan Wake 2, the 6GB RTX 3050 managed just 31 fps on average – not remarkable performance given the 1080p resolution and lowest quality settings – but while it was 23% slower than the 8GB model, it was still 55% faster than the RX 6400.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/CS2.png" alt="Counter-Strike 2 benchmark results"></p> <p>Lastly, in Counter-Strike 2, the 6GB version was 15% slower than the original, averaging 178 fps, which is still excellent performance overall. That's comparable to the 6500 XT 4GB and even the older <a href="https://www.techspot.com/review/1961-radeon-rx-5500-4gb-vs-8gb/" target="_blank">5500 XT</a>, though it still represents a 16% improvement over the RX 6400.</p> <h2 id="Average_Performance">Average Performance</h2> <p>Here's the 13-game average: the 6GB RTX 3050 was 18% slower than the 8GB model, a significant margin that is certainly problematic considering they share the same name. The 6GB model was also slightly slower than the 8GB 6500 XT but 25% faster than the 4GB version. As noted in a <a href="https://www.techspot.com/article/2815-vram-4gb-vs-8gb/">recent feature that looked at VRAM</a>, 4GB of VRAM is insufficient for modern gaming, even at 1080p using the lowest quality settings.</p>
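<p>For anyone curious how a single headline figure falls out of 13 individual results, here's a minimal sketch. The per-game fps values are placeholders rather than our data, and the geometric mean is our own illustrative choice – a plain arithmetic mean is equally common:</p>
<pre><code class="language-python">
# Condensing per-game results into one average margin.
# All numbers below are illustrative placeholders.
from statistics import geometric_mean  # Python 3.8+

fps_6gb = [62, 58, 36, 61, 45, 70, 31, 178, 58, 88, 52, 64, 60]
fps_8gb = [72, 67, 43, 77, 58, 79, 40, 209, 76, 105, 63, 78, 71]

# A geometric mean keeps one outlier (e.g. an esports title near
# 178 fps) from dominating the average.
avg_6gb = geometric_mean(fps_6gb)
avg_8gb = geometric_mean(fps_8gb)

print(f"6GB model is {(1 - avg_6gb / avg_8gb) * 100:.0f}% slower on average")
</code></pre>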
<p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Average.png" alt="13-game average performance"></p> <p>Compared to AMD's fastest GPU that doesn't require external power, the RX 6400, the 6GB 3050 is a remarkable 50% faster. However, AMD takes the lead for budget-conscious gamers who can accommodate external PCIe power connectors, as the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/" target="_blank">RX 6600</a> is 53% faster than the 6GB RTX 3050, likely offering much better value. Let's explore this further.</p> <h2 id="Cost_per_Frame">Cost per Frame</h2> <p>As expected, the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/" target="_blank">RX 6600</a> retains its title as the best value entry-level gaming GPU, given that the 6GB RTX 3050 costs an excessive 38% more per frame. Unless required for a low-powered application, the new GeForce GPU presents poor value.</p> <p>One could argue that for those with the option of external PCIe power, the 8GB version of the 6500 XT also offers better value, costing 14% less per frame and being marginally faster. It also includes an extra 2GB of VRAM, but like the RX 6400, it is limited to just two display outputs and PCIe x4 bandwidth, and it lacks hardware encoding.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Cost.png" alt="Cost per frame comparison"></p> <p>For us, the slight savings on the <a href="https://www.techspot.com/review/2789-amd-radeon-6500-xt-revisit/" target="_blank">8GB 6500 XT</a>, which is expected to be priced at $160, do not justify its purchase. And that's assuming it actually sells for its intended $160 starting price; anything above that makes it even less attractive.</p>
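<p>Cost per frame itself is simple division: street price over average fps, with lower being better. A quick sketch, using hypothetical prices and frame rates chosen only to land in the same ballpark as the margins discussed above:</p>
<pre><code class="language-python">
# Cost per frame = street price / average fps; lower is better.
# Prices and fps are hypothetical placeholders, not retail quotes.

cards = {
    "RTX 3050 6GB":   (170, 60),   # (price_usd, avg_fps)
    "RX 6600":        (190, 92),
    "RX 6500 XT 8GB": (160, 62),
}

# Sort cheapest-per-frame first.
for name, (price, fps) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:>16}: ${price / fps:.2f} per frame")
</code></pre>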
<p>For low-powered applications, it's marginally better value than the RX 6400, but given that the Radeon GPU only averaged 40 fps where the 6GB 3050 achieved 60 fps, we'd argue the GeForce offers much greater value than the cost per frame alone might suggest. It also benefits from more VRAM, double the PCIe bandwidth, support for more than two display outputs, and hardware encoding.</p> <p>Compared to the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/" target="_blank">8GB RTX 3050</a>, the new 6GB version is almost 30% better value per frame, a significant improvement. Although we disapprove of the consumer-unfriendly naming strategy Nvidia has chosen, the unexpected upside here is better value, even if the card is noticeably slower. So it's difficult to be too hard on Nvidia here, even if the naming is a bit dodgy.</p> <h2 id="Power_Consumption">Power Consumption</h2> <p>Here's a quick look at total system power draw. Interestingly, in <a href="https://www.techspot.com/article/2164-cyberpunk-benchmarks/" target="_blank">Cyberpunk 2077</a>, systems fitted with the 6GB RTX 3050 and the 4GB 6500 XT drew a similar amount of power. Consumption was also 20% higher than with the RX 6400, but performance was 71% greater, so in terms of efficiency the Radeon GPU gets annihilated, and so does the 4GB 6500 XT.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Power1.png" alt="Total system power consumption, Cyberpunk 2077"></p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/bench/Power2.png" alt="Total system power consumption, Assassin's Creed Mirage"></p> <p>For the most part, we observed similar margins in Assassin's Creed Mirage. Again, the 6GB RTX 3050 had a similar power draw to the 6500 XT while being 6% faster in this title. Compared to the RX 6400, the power draw of the 6GB 3050 was 13% higher, but its performance was also 15% greater, so efficiency between the two was comparable, with a slight edge to the GeForce.</p>
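<p>Efficiency here is just frames per watt of total system power. A small sketch using hypothetical numbers picked to mirror the margins above (roughly 15% more performance for 13% more power):</p>
<pre><code class="language-python">
# Frames per watt of total system power; higher is more efficient.
# The fps and wattage figures are hypothetical placeholders.

systems = {
    "RTX 3050 6GB": {"fps": 70, "watts": 250},
    "RX 6400":      {"fps": 61, "watts": 221},
}

for name, s in systems.items():
    print(f"{name}: {s['fps'] / s['watts']:.3f} fps/W")
# Expected output: near-identical fps/W, i.e. comparable efficiency.
</code></pre>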
<h2 id="Temperatures">Temperatures</h2> <p>Now for a quick look at operating temperatures and clock speeds. For all our testing, we purchased the MSI Ventus 2X version of the 6GB RTX 3050. After an hour of load in an enclosed ATX case with an ambient room temperature of 21°C, we observed a peak hot-spot temperature of 72°C, which is acceptable, though the 2,000 RPM fan speed was a bit loud, and the card could clearly be heard over our case fans.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/images/2024-03-15-image-2.jpg" alt="MSI Ventus 2X GeForce RTX 3050 6GB"></p> <p>In this test, we saw an average clock speed of 1,650 MHz, and the memory ran at the expected 1,750 MHz. As for overclocking, there's no headroom, as you're limited by the power delivered through the PCIe slot: the power slider is locked at 100%, and attempting to squeeze out an extra 100 MHz resulted in crashes.</p> <h2 id="What_We_Learned">What We Learned</h2> <p>We've got mixed feelings about the RTX 3050 6GB. The naming choice is certainly problematic; calling this an RTX 3050 is very misleading. The <a href="https://www.techspot.com/review/2403-nvidia-geforce-rtx-3050/" target="_blank">RTX 3050</a> was released two years ago as a GA106 part, and its performance and specifications are well established.</p> <p>To introduce a GA107 part under the same name and muddy the waters is questionable.
Of course, it's not only Nvidia engaging in funny buggers with product names; recently, we addressed <a href="https://www.techspot.com/review/2802-amd-ryzen-5700/">AMD's confusing naming</a> with the <a href="https://www.techspot.com/review/2802-amd-ryzen-5700/" target="_blank">Ryzen 7 5700</a>, which, contrary to expectations, is a <a href="https://www.techspot.com/review/2293-amd-ryzen-5700g/" target="_blank">5700G</a> with the iGPU disabled, making it a much slower part than its name suggests.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/images/2024-03-15-image-10.jpg" alt="GeForce RTX 3050 6GB graphics card"></p> <p>We've witnessed varying degrees of misleading product naming from Nvidia over the years. The 6GB RTX 3050 is one of the less severe instances, and the product itself is quite good, but it would have been more appropriate to name it the RTX 3040.</p> <p>When we say it's a good product, we mean it's decent overall. To consider it good or great, you have to value its ability to operate without external power – a niche use case, but one that exists.
As a standard gaming product, it's passable but not exceptional.</p> <p style="text-align: center;"><img src="https://www.techspot.com/articles-info/2816/images/2024-03-15-image.jpg" alt="GeForce RTX 3050 6GB"></p> <p>For just $20 more, the <a href="https://www.techspot.com/review/2505-geforce-rtx-3050-vs-radeon-6600/" target="_blank">Radeon RX 6600</a> offers significantly better value, delivering over 50% more performance on average, more VRAM, and a superior gaming experience overall, even allowing for 1440p gaming.</p> <p>So there you have it: the RTX 3050 6GB isn't as outrageous as we had anticipated. What are your thoughts on this GPU? Feel free to share your opinions in the comments below.</p> <h5>Shopping Shortcuts:</h5> <ul> <li>Nvidia GeForce RTX 3050 6GB on <a href="https://www.amazon.com/ASUS-NVIDIA-GeForce-Gaming-Graphics/dp/B0CVCG2VPK/?tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>AMD Radeon RX 6600 on <a href="https://www.amazon.com/PowerColor-Fighter-Radeon-6600-Graphics/dp/B09H3PY14M/?tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>Nvidia GeForce RTX 4060 Ti on <a href="https://www.amazon.com/s?k=Nvidia+GeForce+RTX+4060+Ti&amp;crid=32W8V67AD9RUH&amp;sprefix=nvidia+geforce+rtx+4070%2Caps%2C151&amp;ref=nb_sb_noss_2&amp;tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>AMD Radeon RX 7800 XT on <a href="https://www.amazon.com/s?k=AMD+Radeon+RX+7800+XT&amp;crid=1A60EKHJ7GQUD&amp;sprefix=amd+radeon+rx+7800+xt%2Caps%2C476&amp;ref=nb_sb_noss_1&amp;tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>Nvidia GeForce RTX 4070 Super on <a href="https://www.amazon.com/s?k=Nvidia+GeForce+RTX+4070+super&amp;crid=1DLBYVRVN8RKC&amp;sprefix=nvidia+geforce+rtx+4070+super%2Caps%2C183&amp;ref=nb_sb_noss_1&amp;tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>AMD Radeon RX 7900 XT on <a href="https://www.amazon.com/s?k=AMD+Radeon+RX+7900+XT&amp;ref=nb_sb_noss&amp;tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> <li>Nvidia GeForce RTX 4080 Super on <a href="https://www.amazon.com/s?k=Nvidia+GeForce+RTX+4080+Super&amp;crid=25N1WBQI0VEY7&amp;sprefix=%2Caps%2C396&amp;ref=nb_sb_noss_2&amp;tag=httpwwwtechsp-20" target="_blank">Amazon</a></li> </ul>