Editorial
GameCube vs. X-Box: a clash between two titans. Who will win?
The Nintendo-crafted GameCube? Or the giant OS conglomerate's X-Box? These
debates have sprung up on message boards across the web, at lunch tables,
and on many websites abroad. Yet plenty of people are already calling X-Box the winner of it
all. Well, if that's your mindset, you couldn't be further from the truth.
Let's start with what throws most people off. When
you look at X-Box's specs, just about everything looks better: the RAM, the clock speed,
the memory bus bandwidth, it all exceeds GameCube. You may say, "Wow, X-Box
looks a lot more powerful than GameCube," but that's where the line is
drawn: it just looks better. X-Box has a lot of raw power, that's a given, but
power is nothing without control.
GameCube's hardware looks a little dated next to
X-Box's, but that's okay: GameCube is one finely tweaked machine. The system
architecture was designed to let the CPU/MPU, GPU, DSP, and RAM communicate and
interact with one another in total, unadulterated harmony.
Knowing most people won't take my word for it on so little
information, let's start with each console's hardware development cycle, shall
we?
GameCube: From what we know, Nintendo has been developing GCN since late 1998 or early 1999.
Since then Nintendo has partnered with many of the industry's finest corporations: the
$1,000,000,000 agreement with IBM, sound chip development with Factor 5, GPU design with ArtX,
memory from MoSys, and the list goes on and on. By letting companies that specialize in certain
areas of the industry handle those areas, you end up with a finer product. Not to mention you're
spreading billions of dollars in costs across many companies, which is part of why GCN is
expected to sell for under $200.
Above: Mario leaning on the International Business Machines (IBM) logo
X-Box: Actual X-Box hardware development probably began a little later than GCN's, but with a
couple of extra billion dollars behind it, which gave Microsoft the advantage of more recent
hardware. From what we know, X-Box is simply a collaboration between Microsoft (duh), Intel,
and nVidia (wow, there's a lot of console experience in that bunch). Intel is providing an
off-the-shelf PIII (which Microsoft says will be a "modified" chip, but which probably just
gains a few new instructions for talking to the rest of the system). nVidia is making the GPU,
which is pretty much based off the GeForce 3's chip architecture (and which, like the PIII,
likely adds only a little extra instruction code for communication). Microsoft is essentially
buying off-the-shelf PC components from other manufacturers, from the 64MB of DDR RAM to the
10GB HDD and the DVD-ROM drive. So basically all Microsoft is handling is controller design,
legal deals, system packaging, and sending boxes full of unmarked bills to anonymous sources.
Now which company has the most experience, and the proper materials, to build a console
specifically geared toward and enhanced for gaming? I'm hoping you said Nintendo. The only
things Microsoft has in its arsenal are a steady flow of cash and an ever-ominous presence
looming over its competitors. Nintendo has the one and only Shigeru Miyamoto conferring with
everybody, from controller design to the people at ArtX building the GPU, on what developers
and gamers want and should expect from GCN.
Above: GameCube motherboard (it's A LOT smaller than
you think)
Now, with all that, GCN has the power where it counts. Here's
a good example: X-Box has 64MB of shared DDR RAM. GameCube has 24MB of
unified, standalone 1T-SRAM, along with 3MB on-chip and a supplemental
16MB of SDRAM (AKA A-RAM, "A" for Auxiliary), for a total of 43MB.
But there's a huge flaw on X-Box's side: DDR RAM is some of the slowest, chunkiest
RAM around. It's double-data-rate, but that only boosts memory bus bandwidth (I'll
get to that later). The designers at Nintendo knew that to get maximum
performance out of their high-speed chipsets they'd need high-speed RAM to match. So,
with a call to MoSys, Nintendo struck a deal to use the fastest RAM on
the planet: MoSys 1T-SRAM. Calling it fast is an understatement. Its maximum
latency has been clocked at only 10ns, and the on-chip texture RAM at half
that, around 5ns (a nanosecond is one billionth of a second). This means GCN
will be able to stream textures in and out while needing less memory, putting
less strain on the CPU and GPU, whose job should be game calculations and
special effects rather than shuffling data around.
Above: IBM’s Gekko CPU (I told you it was small)
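To put those latency numbers in perspective, here's a rough back-of-the-envelope sketch in C.
The 10ns and 5ns figures come from the paragraph above; the 45ns I'm using for ordinary
PC-style DDR is my own illustrative assumption, not an official X-Box spec.

/* Back-of-the-envelope: how many dependent memory accesses fit into one
   60 FPS frame at a given access latency. 10ns and 5ns are the 1T-SRAM
   figures cited above; 45ns is an assumed ballpark for PC-style DDR. */
#include <stdio.h>

int main(void)
{
    const double frame_ns = 1e9 / 60.0;   /* one 60 FPS frame, in nanoseconds */
    const double latency_ns[] = { 5.0, 10.0, 45.0 };
    const char  *label[]      = { "1T-SRAM texture (5ns) ",
                                  "1T-SRAM main (10ns)   ",
                                  "assumed DDR (45ns)    " };

    for (int i = 0; i < 3; i++)
        printf("%s -> ~%.0f accesses per frame\n",
               label[i], frame_ns / latency_ns[i]);
    return 0;
}

It's a toy model, but it shows why latency, and not just the headline bandwidth, decides how
freely you can shuffle textures around every frame.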
Next up: memory bus bandwidth (MBB). That's how fast the RAM can
communicate with the other parts of the system that use it. X-Box has a
maximum of 6.4GB/s; GameCube has 3.2GB/s. But X-Box's is shared. Shared,
you say? X-Box has 64MB of DDR RAM, and the developer can partition it however
they need. As a very simple example, a developer might set aside 16MB
as VRAM, 12 for sound, 16 for textures, 12 for animation, and 8 for the frame
buffer (real projects are far more complicated; most developers will need
well over 20 partitions). Each of those partitions greedily takes its share of
the MBB, dragging effective transfer speeds down to under 1GB/s if you're lucky.
What does this mean for games? For the memory to be partitioned, the CPU has to
manage each of those regions. So instead of working on AI, physics, and
special effects, the CPU is crunching numbers. That can lead to slowdown, and since
developers can't stream data in and out quickly enough, more RAM has to
be used, taking away from textures, sound, animation, and so on down the
list.
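Here's the same shared-bandwidth argument as a tiny C sketch. It's my own oversimplification,
an even split among whatever regions are hitting the bus at once, not how the X-Box bus
actually arbitrates, and the region counts mirror the toy example above.

/* Toy model of shared-bandwidth contention: one 6.4 GB/s pool divided evenly
   among however many memory regions are active at once. A deliberate
   simplification, not real X-Box bus arbitration. */
#include <stdio.h>

int main(void)
{
    const double shared_bw_gbs = 6.4;   /* X-Box headline DDR bandwidth    */
    const double gcn_main_gbs  = 3.2;   /* GameCube main 1T-SRAM bandwidth */
    const int    active[]      = { 1, 5, 20 };   /* 5 = toy split, 20+ = realistic */

    for (int i = 0; i < 3; i++)
        printf("%2d active region(s): ~%.2f GB/s each\n",
               active[i], shared_bw_gbs / active[i]);

    printf("GCN main memory, uncontended: %.1f GB/s\n", gcn_main_gbs);
    return 0;
}

With the five-region toy split each client sees about 1.3GB/s, and with the twenty-plus
partitions a real game would use it drops well under 1GB/s, which is exactly the squeeze
described above.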
Sorry for blabbering on about the technical side there, but if you
followed all that, it makes a pretty good point.
I'll make a simpler point: X-Box is sort of the
"blaster" of the console world, and GCN the "lightsaber."
X-Box throws superfluous amounts of resources at every task;
GCN is tweaked, accurate, and not a clumsy tool of justice. Get what I'm
saying?
Polygons, you say? Ah, yet another mercilessly fought debate that
has raged for ages. Now let's compare figures from both companies (I ripped these from the
official sites):
X-Box: 125 million polygons/second
GameCube: 6 million to 12 million polygons/second (display capability assuming an actual game,
with complex models, textures, etc.)
Note how on the official X-Box site there's no disclaimer attached to that number. It's
obviously a maximum performance rating, which means absolutely nothing: it doesn't account for
player interaction, and the polygons are dull, flat-shaded, similarly shaped, and textureless,
with no special effects engaged.
GameCube's official poly figures are noticeably low, but
in typical Nintendo fashion Yamauchi wanted to keep the hype down and made sure
the performance claims weren't over the top. EA Canada got their hands on
some development kits, and in February of this year they announced performance
figures of 22 million polygons/second at 60 FPS. Cut the frame rate in half and you double
the per-frame budget: at 30 FPS that's over 700,000 polygons on screen every frame, which is
nothing short of awe-inspiring.
Above: The ArtX-developed GPU "Flipper"
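If you want to check that frame-budget math, here's the arithmetic as a quick C snippet. The
22 million polygons/second figure is EA Canada's reported number from above; everything else
is just division.

/* Per-frame polygon budget implied by EA Canada's reported 22M polys/sec.
   Halving the frame rate doubles the budget per frame, not the throughput
   per second. */
#include <stdio.h>

int main(void)
{
    const double polys_per_sec = 22e6;   /* EA Canada dev-kit figure */

    printf("Per frame @ 60 FPS: ~%.0f polygons\n", polys_per_sec / 60.0);
    printf("Per frame @ 30 FPS: ~%.0f polygons\n", polys_per_sec / 30.0);
    return 0;
}

At 60 FPS that works out to roughly 367,000 polygons a frame, and at 30 FPS roughly 733,000,
which is where the "over 700,000 on screen" number above comes from.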
X-Box hasn't exactly been seen running on final production
hardware yet (which is supposedly in production as we speak); in fact, it
hasn't even been shown running on a development kit. So far the only thing
we've seen is what X-Box developers can do with PCs a little more
powerful than X-Box. And from what I've seen, what developers have done on
PCs more powerful than both GCN and X-Box, with nine extra months of time and
Microsoft funding, hasn't been light-years ahead of what was shown at
Space World 2000, where developers had at most two months to put something together (in
fact, the Rogue Squadron Cube demo was done in 20 days!). And all of those demos
ran on actual units, which may themselves be inferior to the final hardware, as there
are still some specifications to be released at E3 this May. So X-Box's real-world poly
figures are in the same league as GCN's, if not inferior.
Then, finally, another big issue: sound. Just recently Microsoft
announced that X-Box will feature Dolby Digital 5.1 in-game with no hit to
performance. GameCube features the not-much-talked-about Factor 5 MusyX sound
chip (AKA the DSP). From what the ever-so-quiet Nintendo has released, the
MusyX chip is the most advanced sound processor of its kind. It has been
said to react to how you play the game and to the overall visual atmosphere
you're in. Right now Nintendo and Factor 5 have yet to release any specifications
concerning outputs, but since the DVD-movie version of GCN will support DD5.1,
it's almost certain the game-only version will as well.
But that was all about sound output. I've basically given an
overview of the hush-hush MusyX chip, and that's about all there is to talk about, because
X-Box doesn't even have a dedicated DSP. Just like the older consoles before it, X-Box will
only do pre-rendered, repetitive musical scores. Not much innovation there (something
Nintendo believes is key in this new era).
Well, I hope I was able to address the bulk of your concerns
in these 1,660 words. Simply put, GCN isn't going to be a technical pushover for
X-Box; GameCube has a fighting chance equal to, if not better than, the X-Box
juggernaut. I know battles will still be fought, but I just wanted to calm the
concerns of my fellow Nintendoids. GameCube lives!
-Pretendo
PS: I know I’m biased and I’ve enraged a lot of X-Box
fanboys. So why not send your questions and/or comments to me? I’ll gladly
pick ones that aren’t full of rage and see to it they’re posted for all to
see.
By: Pretendo (GamecubeXL.com)