15 Years Later: How The Duke strolled in and changed the video game market
Once upon a time, Japan was the center of the video-game universe. Nintendo’s Famicom poured gasoline on the video-game craze in Japan, and its American equivalent, the NES, revived the flagging U.S. gaming market after the Atari crash of the early 1980s. “Nintendo” became to video games what “Google” is to online search, and through the era of Super Nintendo vs. Sega Genesis and Nintendo 64 vs. Sega Saturn vs. Sony PlayStation, console gaming skyrocketed to the forefront of pop culture worldwide.
By the year 2000, every surviving console and handheld came from a Japanese firm, and many of the best games of the era came from Japanese companies. With the arrival of the Sega Dreamcast and Sony PlayStation 2, coupled with Nintendo’s handheld strength in the Game Boy Advance, that era looked set to continue unabated for years to come.
It was into this Japanese-dominated market that Microsoft launched the Xbox almost 15 years ago, in November 2001. Looking back at the legacy of the Xbox, Xbox 360 and Xbox One, it’s impossible to ignore the impact Microsoft has had on the games market. And though the Xbox One may trail Sony’s PlayStation 4 in unit sales, Microsoft has already achieved its larger goal: changing how consoles are crafted and how games are developed.
Start Me Up
The Xbox was the first American-made console to enter the market since the Atari Jaguar breathed its last. Considering the strength of the Japanese incumbents, it seemed crazy for Microsoft to enter that market, especially since the bulk of the company’s gaming hardware experience amounted to a handful of SideWinder-branded controllers and peripherals for PC. Microsoft was huge in the computer business as a software firm; the Windows operating system has been ubiquitous ever since the Rolling Stones ushered in Windows 95. Microsoft had made games for PC and even consoles before, but had zero experience as a gaming platform owner. So what led it to step into the gaming industry?
Programming interfaces and software development tools.
It may seem obvious in 2016, but the X in Xbox comes from DirectX, Microsoft’s family of gaming-oriented APIs. In the late ’90s, Microsoft was drafted to provide a lightweight version of Windows CE with DirectX support for Sega’s ill-fated Dreamcast (hence the Windows CE branding on the system itself). That project planted the console gaming bug. A skunkworks team at Microsoft built a DirectX-centered gaming system out of PC hardware, and after a couple of years of iteration, the Xbox was born.
The first true innovation of the Xbox was to base it on PC parts and DirectX, letting developers jump in with familiar tools -- many of which they already had on hand. Earlier consoles were built from hardware that was rarely so unified. Individual chips were often shared with other devices (the chip family that powered the Saturn and Dreamcast also turned up in Japanese cars’ engine-control computers, for example), but the system as a whole was not standardized the way a PC is. Parts were off the shelf, yet the toolchains built around them were bespoke, which inherently led to specialization and differences between consoles.
But those American PC parts and libraries -- and the console’s American-focused development, capped off by a CES 2001 debut hosted by Bill Gates and The Rock -- led to a console that was super-sized, even by American standards. The monolithic box was massive but powerful, featuring Nvidia-powered graphics and a beefy 8 GB hard drive, which helped usher in the era of patches and DLC for console games.
Enter The Duke
Just as massive was the original controller. Yes, the controller so big that Microsoft had to create a smaller model for the Japanese market. Yes, the controller that became a meme before memes were weaponized. The controller now commonly known as The Duke -- reportedly its code name within the Xbox team, picked up by gaming magazines and websites as a nickname and passed along until it stuck (how everyone settled on it remains one of my favorite bits of gaming culture). Microsoft decided an American console needed an American-sized controller, and so the original Xbox Controller was born. It featured offset twin analog sticks, six face buttons, and two triggers -- traits that have survived through all three Xbox console generations -- but packed them into a massive frame built around an enormous, cheap plastic bubble. A good idea, poorly executed: beyond the sheer size, the face-button placement and design were dreadful. A too-large controller was a big mistake to make early on, but it was quickly corrected; by the time I got an Xbox in 2002, the far more sensibly sized Controller S had put The Duke out to pasture.
Microsoft learned valuable lessons from that initial Xbox system -- about console hardware, which features to focus on, and software development -- and put them into action with the Xbox 360. Despite one of the goofiest announcement events ever (live on MTV!) and an equally odd launch (imagine the Best Buy games section transplanted to Burning Man), it became a hit system. Microsoft leveraged the confidence gained from its first console to take the necessary risks to topple Sony’s lead in the market -- to invest, to innovate, and to push the market forward.
SD to HD and Microsoft’s Ascent
The generation shift from Xbox to Xbox 360 lined up almost perfectly with the start of the HD era: mass-market penetration of HDTVs, consoles capable of resolutions up to 1080p and -- most importantly -- the switchover in developed markets from standard-definition to HD content. Crucially, American and other Western developers handled this shift better than their Japanese peers. The billion-dollar question is: why?
It’s difficult to pinpoint an exact reason behind this rise of the West and decline of Japanese developers. Instead, it may be a case of death by a thousand, coincidentally timed cuts. And Microsoft was in the right position to capitalize on much of what impacted its Japanese competitors.
In Japan, the shift from the 1990s to the 2000s also coincided with the last echoes of the 1980s boom-time falling from earshot. The economic good times finally ended, cultural and demographic problems caught up with Japan, and the country entered a recession it has never truly escaped. Less a flu and more a permanent nagging cough, these economic issues still haunt Japan to this day, and they impact the gaming market in many ways.
Japan’s home-console success did not carry over into the first HD generation. The PlayStation 2 (155 million units sold worldwide, 21 million in Japan) and the Nintendo Wii (101 million worldwide, 12 million in Japan) were runaway hits -- and note that both are standard-definition consoles incapable of HD visuals. Since then, the Japanese console market has shrunk: the PS3 sold 6 million units there, and the PS4 has sold only 3 million as of publication. This in a country of 128 million people that once bought 21 million PS2s.
Why did this happen? HDTVs aren’t foreign to Japan, and gaming remains popular there. It’s just that console gaming as a mass-market prospect has been thoroughly displaced by mobile gaming. Beyond the success of the Nintendo DS, 3DS, and PlayStation Portable, many game developers shifted into the smartphone game market. Simple economics may explain it: the extra cash that would have gone toward a console 10 years ago now goes to a new smartphone -- after all, a mobile device is mandatory for 21st-century life.
But that still doesn’t get to the heart of the issue. Smartphones are also beyond mainstream in the U.S., where the console market remains incredibly strong. The slide of console gaming from the mainstream has as much to do with the Japanese gaming industry’s corporate culture as it does external economic issues, and in a strange way the industry problems were exacerbated by the Xbox and its architecture.
HD games require HD assets, which are a lot more costly to develop. Development teams and budgets swelled just as Japan entered and endured the Great Recession, creating a perfect storm. While many game players almost take the western indie scene for granted -- especially the pattern of developers leaving large studios to work independently on passion projects -- that scene has taken much longer to germinate in risk-averse Japan. Those cultural differences hold back Japanese developers to this day.
Nor should the generational shift among developers be taken for granted. Those who cut their teeth in the 1980s and made hay in the 1990s were, by the time the Xbox launched, managers at far larger businesses than the ones they had joined. The drain of talent into management roles, the aging of a generation of creators, and the reluctance of developers now in their 40s and 50s to leave steady, salaried positions for the kind of independent risk they might have embraced in their 20s -- all these subtle issues play a part.
One of the biggest reasons Western developers could absorb the cost and time demands of HD lay in their development platforms of choice. From 2005 on, American and European studios quickly turned to middleware and off-the-shelf game engines, while bespoke engines remained the territory of massive publishers (EA and its Frostbite tech) or the audacious (Kojima Productions and the Fox Engine). Japanese developers, especially at the AAA level, have been held back by an insistence on internally developed engines. Meanwhile, popular and accessible engines like Unity and Unreal have streamlined development in the West at every level. And what allowed those engines to be so powerful and flexible? The commonality of DirectX and x86 as a standard.
Changing the Game
15 years have gone by since the original Xbox was released. Personally, that’s almost half my life; in gaming terms, it marks the halfway point between the NES launch in North America and today. For Microsoft, the period stretches from a fledgling first generation, through a massive success, and into the present. And coincidental or not, its rise has coincided with huge changes to the industry, especially in Japan.
Did Microsoft intentionally kill the Japanese industry? I don’t believe so; I doubt Microsoft would say so publicly, and what its higher-ups believe privately is a matter of speculation. I think Microsoft took advantage of a tough situation -- the shift to the HD era -- made worse by economic and demographic realities sinking in.
I think Microsoft can look back proudly after 15 years and see a console market bent its way -- through strategic force, lucky bounces, or a combination of the two. From nothing, Microsoft is now one of two major players in the home console business, selling millions of systems around the world, producing award-winning titles, and pushing the limits of what a game console can do. Moreover, it has built a public, consumer product for a company that previously specialized in operating systems and office software. Yes, it produced clunkers like The Duke, Blinx: The Time Sweeper, and the sartorial foul that is a hoodie under a blazer. But Microsoft also gave us Halo, Xbox Live, the rise of indie games, and an American platform holder in the previously Japanese-heavy business of the games market. It has shipped three full console generations and played a large part in shifting gaming’s balance of power from the Land of the Rising Sun to the other side of the Pacific Ocean.