
The Technology Behind Nintendo Consoles: 2001-2007

In recent years, Nintendo has been known more for the eccentricity of its hardware than anything else. From the motion controls of the Wii to the dual-screen gameplay of the Wii U and the portability of the Switch, much analysis has focused on the physical design of Nintendo’s hardware while ignoring the systems’ most critical fundamentals: their architectural design and their technology. As Sony and Microsoft have waged a fierce war over console specs across the past three generations, pitting machines against each other in an ever-evolving battle for supremacy, Nintendo has focused on the software.

That’s a shame, because as a forerunner of the modern games industry and a talented computer hardware maker, Nintendo deserves a closer look at the technology behind its consoles and what makes them unique. In Part 2 of this three-part series, we dig deeper into Nintendo’s design choices for the esoteric GameCube and the groundbreaking Wii.

Find the first part of this series here.

Nintendo GameCube (GCN): Lost at Sea

Licking its wounds after a fifth-generation beating at the hands of Sony, Nintendo went back to the drawing board and analyzed its mistakes. The next step was to figure out what had gone wrong with the N64 and avoid repeating the same fatal errors. While the SNES and the N64 had been a challenge for even the most seasoned programmers, Nintendo wanted to change that with its upcoming “Dolphin” gaming system. Announced by Nintendo of America chairman Howard Lincoln in 1999, Dolphin was marketed as “33% above the projected performance data of [Sony’s] PlayStation 2” and “easily twice as fast as [Sega’s] Dreamcast” while remaining “quite cheap,” bringing power, as Shigeru Miyamoto said, to “all types of software developers interested in creating.”

IBM’s “Gekko” was Nintendo’s first foray into the PowerPC architecture. Credit: Wikipedia.

At the heart of this increased graphics capability were IBM’s “Gekko” CPU and ATI’s “Flipper” GPU. Similar to IBM’s 32-bit PowerPC 750CXe, used in some Apple G3 machines, Gekko was an incredibly energy-efficient processor whose shorter CPU pipelines limited it to a maximum operating frequency of 485 MHz. Chosen for its cool operating temperature and small form factor (in keeping with the overall design pattern of the GameCube), Gekko was inexpensive, easy to produce, and energy efficient: three key factors that allowed it to pair well with the GameCube’s real engine, ATI’s Flipper GPU.

After a brutal break with Silicon Graphics, Nintendo contracted graphics developer ArtX to develop Flipper in 1998. In 2000, however, ArtX was purchased by GPU maker ATI, which officially shipped Flipper. Initially clocked at 200 MHz before being reduced to 162 MHz when Nintendo adjusted Gekko’s clock rate shortly before launch, Flipper was a powerful, efficient GPU that exploited several technologies (e.g. anti-aliasing) that were substantial advancements over what previous console generations had offered.

Part of this advancement came from the way Flipper used its 3 MB of embedded memory. With more than half of the GPU’s 51 million transistors dedicated to this 3 MB arrangement (composed of a 2 MB Z-buffer and a 1 MB texture cache), Nintendo relied heavily on Flipper’s embedded 1T-SRAM to achieve results admirable even by contemporary PC standards. Although there were a few flaws in the overall chip design, including the use of the same 1T-SRAM for the system’s main memory, which showed ATI’s lack of involvement, Flipper ultimately proved to be a fast, cost-effective GPU with the powerful tools developers wanted.


“Flipper” was the powerful GPU at the heart of the GameCube. Credit: Wikipedia.

The GameCube was Nintendo’s first console with full support for 480p progressive-scan component video. While it lacked the Xbox’s support for HD output, the GameCube was capable of a much sharper picture than any Nintendo console before it. Unfortunately, the system’s component cables were built with a proprietary DAC (digital-to-analog converter), which made replication almost impossible for third parties at the time. There were also several expansion ports (two serial ports and one high-speed parallel port) that allowed for future expansion. Nintendo would eventually use the serial ports for modem and Ethernet add-ons that added basic online functionality, and the parallel port for the Game Boy Player, which made it possible to play Game Boy games of any generation on the GameCube. Nintendo would remove the second serial port and component video support when it released a second GameCube model, the DOL-101, later in the console’s lifespan.

As with the N64, Nintendo’s main issue with the GameCube was the way it approached physical media. While the N64 had opted for cartridges over CDs, causing a myriad of problems for developers in the process, Nintendo ultimately chose an optical format for the GameCube. However, still plagued by the piracy concerns that had driven the decision to stick with cartridges when designing the N64, Nintendo decided to use a variant of the 8 cm miniDVD as its standard. Compared to the PlayStation 2, whose discs held between 4.7 and 8.5 GB, GameCube mini discs could hold only 1.5 GB. While this was rarely a problem for first-party titles, which designed their games around the system’s limitations, or for cross-platform games, which generally performed better on Nintendo’s machine anyway, it did prevent DVD playback.

As with the N64, Nintendo’s main issue with the GameCube was the way it approached physical media.

Despite its skill in designing every other aspect of the GameCube, Nintendo’s decision to leave out full DVD compatibility reduced the console’s chances of success. The PlayStation 2 had launched a year earlier at $100 more, but it had several key advantages, including the ability to play full-size DVDs, a feature that three years earlier had cost $599 on its own. For a second consecutive console generation, Nintendo found itself faltering despite more advanced technology than its closest competitor.

Although the GameCube launched with a plethora of third-party support, it quickly evaporated, and the company ended up with a miserable 21 million hardware sales, less than a third of what it had achieved with the NES. By 2004, Nintendo was forced to come up with something different: a successor to the GameCube that would compete not on power but on something else entirely: innovation.

Nintendo Wii: the casual connection

With the GameCube clearly a failure by mid-2004, Nintendo decided to leave its failing console behind and chart a course for the next generation. Nintendo President Satoru Iwata gave the first glimpse of what was to come at GDC 2005, offering tantalizing hints about Nintendo’s next console. Powered by IBM’s “Broadway” processor and ATI’s “Hollywood” GPU, the “Revolution” would offer backward compatibility and Wi-Fi support, and it would be developer-friendly. Instead of focusing on raw power, as the GameCube had, the Revolution would target a parent-friendly $250 price tag and take a completely different path from the other next-generation machines, like Microsoft’s then-upcoming Xbox 360.

Nintendo opted for a blue ocean strategy, targeting a market that hadn’t been targeted before: casual gamers who had never bought a console. Its choice of technology underlined this. IBM’s new Broadway CPU brought an increased clock frequency of 729 MHz, and ATI’s Hollywood GPU raised the GPU clock to 243 MHz. That, along with 512 MB of NAND storage, native widescreen support, and a library of classic games, gave the Revolution a technical edge over the aging GameCube. However, the technology inside wasn’t its biggest advantage.


ATI’s “Hollywood” chip was interesting, but not the focal point of the Wii.

Instead, the Wii relied on innovation. Unveiled at the end of 2005, the Revolution’s controller turned heads. A singular, TV-remote-style, pointer-activated motion controller with few face buttons, it ran headlong against the prevailing industry trend, bucking convention and showing the early signs of a new identity under Iwata, one focused on a more family-friendly image. Instead of an analog stick on the main controller, it relied on the Nunchuk add-on to provide proper analog input. Later came the name: Wii. The subject of a multitude of jokes, puns, and memes after its reveal, the Wii had all the makings of a non-starter. If Nintendo had already competed on power and lost, what would happen now that one of its few advantages was gone?

While technologically leagues behind the PS3 and Xbox 360, and lacking appeal for die-hard gamers, the Wii succeeded in one critical area: selling pure and simple enjoyment. From professional reviewers to soccer moms, it drew rave reviews, sold out repeatedly during its early years, and ignited the retail world by selling over 600,000 units in the United States in just over a week and an incomprehensible 101 million units over its lifetime. The Wii marked the start of an incredible turnaround for Nintendo. After more than twenty-five years of declining hardware sales, the company had finally succeeded in expanding its console market, connecting with casual gamers on a scale never seen before, or since.

Nintendo opted for a blue ocean strategy, chasing… casual gamers who had never bought a console.

Much like the NES, the technology in the Wii was not new or technically impressive, but it was incredibly innovative. However, unlike the NES, the Wii didn’t just combine old technology with good games; it broke new ground in how those games were played. By incorporating motion controls into Nintendo classics, such as Mario Kart, and creating new classics, like Wii Sports, it showed just how flexible Nintendo could be.

However, like all successes, Nintendo’s adventure with the Wii was short-lived. In 2011 there were rumblings of Project Café, a “Wii 2” that, for better or for worse, would decide Nintendo’s future…

Be sure to check out the final part of this series, in which we discuss the technical history of the Wii U and the Switch.


