Tegra 3 Compared to Apple's A5X in Actual Benchmarks; Not as Lopsided as Apple Claimed

dgstorm

Editor in Chief
[video=youtube;TQlu39SIH6M]http://www.youtube.com/watch?v=TQlu39SIH6M&feature=player_detailpage[/video]

Recently, Apple released their new iPad. It is an impressive device and will be a roaring success; however, they also made the wild claim that the A5X is four times faster than the Tegra 3, which had NVIDIA scoffing with a smirk. Now that the device has been released, the "proof is in the pudding," and Laptop Mag pitted the new iPad against a Transformer Prime to test the veracity of Apple's claims. As it turns out, they are sorta right, from a certain point of view, but only if you bend your thinking in the narrowest way possible.

You see, it turns out that the graphics processor in the new A5X is faster than the one found in the Tegra 3 in an OpenGL 3D benchmark. In fact, the A5X does render over four times as many texture pixels (texels) as the NVIDIA Tegra 3; however, this only translates into twice as many frames per second overall compared to the Tegra 3. So, ultimately, one small part of the processing power of the A5X is four times faster, but the actual total real-world result is only twice as fast. Of course, we have no problem giving credit where credit is due, and the fact that it is twice as fast is a remarkable achievement by Apple. Still, this is only in one particular graphics benchmark.
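To see how that happens, here's a quick back-of-the-envelope Python sketch. The millisecond figures are purely illustrative (not measured from either device), but they show why quadrupling just the pixel-fill stage of a frame doesn't quadruple the overall frame rate:

[code]
# Illustrative sketch only -- these per-frame costs are made up, not measured.
# A frame's render time is the sum of several stages, so a 4x faster fill
# stage does not make the whole frame 4x faster.

def fps(fill_ms, other_ms):
    """Frames per second given per-frame fill time plus all other work."""
    return 1000.0 / (fill_ms + other_ms)

tegra3_fps = fps(fill_ms=20.0, other_ms=13.0)    # hypothetical stage costs
a5x_fps = fps(fill_ms=20.0 / 4, other_ms=11.0)   # fill stage is 4x faster

print(f"Tegra 3: {tegra3_fps:.0f} fps, A5X: {a5x_fps:.0f} fps, "
      f"ratio: {a5x_fps / tegra3_fps:.1f}x")     # roughly 2x, not 4x
[/code]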

Interestingly, in other benchmarks, the raw computational power of the Tegra 3 was vastly superior to that of the A5X. The quad-core chip in the T-Prime outperformed the iPad in integer, floating point, and memory performance. Finally, things were taken a step further to truly test real-world performance, because everyone knows that benchmarking programs don't really show the whole picture. The two tabs were pitted against each other while playing Riptide GP and Shadowgun. In this closest-to-real-world test, the two tabs were subjectively in a dead heat. Apparently, the Retina Display on the iPad is so gorgeous it made the game look fantastic; however, there are some custom graphical flourishes, like flowing water and billowing flags, that are optimized for the Tegra 3, which make it stand out as well.

Ultimately, the truth in real-world applications tends to diminish all the marketing hype surrounding the release of a new device, and Apple's bold claims prove to be a bit over-hyped in this instance. Of course, this goes both ways, so who knows? When the next Tegra device comes forward, NVIDIA may be singing the same song as well. Luckily, we consumers are the winners when these folks duke it out over who is the fastest and best!

Source: PhanDroid
 
I used to get excited about 15-20 percent increases in performance, but double the performance is simply incredible.
 
I'm sorry, but he had the Prime in SuperIPS mode, which is supposed to be used when viewing the tablet in direct sunlight. It's not supposed to be used indoors because it's like looking at the sun. That alone will make the colors look better on the iPad. Also, the extra eye candy DOES matter. If it didn't, not only would the devs not add it, but everything would still be 16-bit graphics. It sounds like they had to take stuff out of the iOS version of the games because it was unsupported or slowed things down to a noticeable level. IMO, Apple loses in that category and Android wins hands down.
 
It's all semantics, but it's good to get what each side said correct. Apple never said the A5X offers four times the performance of Tegra3. They took the very narrow view that you described. Phil Schiller said that the GPU in the A5X offers 4 times the performance of the one in Tegra3. And it does.

Plus, all the other tests are CPU tests haha; it's not very valid to compare the CPU in a quad-core processor to the CPU in a dual-core processor. If Tegra3 didn't win those, NVIDIA might as well go back to the drawing board.

It's a small distinction, yes, but the claim was that the A5X has 4 times the graphics processing power, and it does.

I'm still waiting for Anandtech...
 

Actually, not to be contrary, but the article does not confirm that it is four times faster. The A5X doesn't have 4 times the performance. It renders 4 times the number of texture pixels in the OpenGL benchmark. This only relates to how each polygon in a scene is effectively "painted on". The actual performance numbers in frames per second of the final benchmark were "only" twice as fast. Still very impressive, but not at all accurate to say that the graphics are 4 times faster. Only one small part of the graphics was four times faster in one benchmark. The actual gaming performance numbers turned out a bit different as well, as I stated in the article. Ultimately, the A5X's graphics are undoubtedly faster than the Tegra 3's, but not as fast as Apple claimed, when you look at the "total factor".

Also, your statement about a quad-core processor necessarily being faster than a dual-core chip is not always accurate either. During the time when Intel was moving from the Core 2 Duo to their quad-core solutions, there were several dual-core chips that were significantly faster than some of the lower-end quad-core chips. This is partially due to differences in clock speed and internal cache. One example was the Wolfdale Core 2 Duo. That chip was blazingly fast, and when over-clocked could out-perform even some of the higher-end quad-core chips of that time. Of course, that changed later, but it was a good example for a while. Sometimes, efficiency improvements in older tech can be better than brand-new cutting-edge stuff.
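To put rough numbers on that, here's a minimal Amdahl's-law sketch in Python. The clock speeds and the 50% parallel fraction are hypothetical, not the specs of any real chip:

[code]
# Hypothetical numbers only -- a toy Amdahl's-law model showing how a
# faster-clocked dual-core can beat a slower quad-core when a big chunk
# of the workload doesn't parallelize.

def relative_speed(cores, clock_ghz, parallel_fraction):
    """Throughput relative to a 1 GHz single core, per Amdahl's law."""
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

duo = relative_speed(cores=2, clock_ghz=3.0, parallel_fraction=0.5)
quad = relative_speed(cores=4, clock_ghz=2.4, parallel_fraction=0.5)
print(f"dual-core: {duo:.2f}, quad-core: {quad:.2f}")  # 4.00 vs 3.84
[/code]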

Anyhoo... I wasn't trying to slam your perspective, just offering additional clarity. :)

I agree with you on the last part... we really need to see Anandtech do a comprehensive comparison to get a complete picture. Regardless, the A5X is a remarkable achievement that is only overshadowed by the Retina Display on the same device. That thing is gorgeous!
 

Clarity is always appreciated haha. Apple's statement was definitely generic, and they definitely also did some "launch-speak" where they made some statements without backing them up. However, as I mentioned in the other thread about this, I didn't expect a consistent 4X across the board. What Apple undoubtedly did was run some controlled benchmark tests, with the variables they set, saw that they achieved 4X GPU performance on one of those benchmarks, and ran with "we have 4X the graphics performance". It would be impossible to duplicate the exact same test under the exact same conditions, and they don't really have to. They can prove, and the tests above also prove, that the A5X can offer 4X the graphics performance. This is the only GPU test that was run:

[Chart: GLBenchmark fill test results]

The other tests are all CPU tests. 404,614,112 x 4 = 1,618,456,448. So it's not 4 times as fast, it's more than four times! And yes, it is speed and not just texels rendered. It's texels per second that they measured, which is a "speed" benchmark. And again, it is the only graphics test that they ran, and the A5X does indeed perform 4X better vs Tegra3 in GPU.
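If you want to sanity-check that multiplication, here's the one-liner in Python (only the Tegra3 figure comes from the chart above; the threshold is simply 4x that):

[code]
# Texels per second is a rate, so comparing them is comparing speeds.
tegra3_texels_per_sec = 404_614_112          # Tegra 3 result from the chart
threshold_4x = tegra3_texels_per_sec * 4
print(f"{threshold_4x:,}")                   # 1,618,456,448
# Per the chart, the A5X's measured rate comes in above this threshold.
[/code]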

Phil Schiller definitely made a vague statement with no backup, but what he said wasn't a lie.

The second part of your post, about dual core not necessarily outperforming quad core, is also something I tend to disagree with. Maybe when quad-core chips first came out vs an overclocked dual-core chip, it wouldn't have been a gimme. But with today's chips, I can't imagine any company with a quad-core chip that is in danger of being defeated by dual core and being OK with that. It just shouldn't happen... I still stand by that if NVIDIA didn't beat Apple on CPU performance, they might as well go back to the drawing board haha.

Apple made no claims about CPU performance; they'd be foolish to do so haha, because Tegra3 absolutely will outperform it. However, in graphics performance the A5X is the winner by far more than 4X (based on the test run by LaptopMag). That's pretty damn impressive haha
 
Indeed czerdrill! Speaking of clarity... thanks for adding more details to the benchmark results. That brings things into focus even more. Also, I understand your point better now about dual-core vs. quad-core. During the first evolution from dual-core to quad-core chips with Intel, most software was not optimized to take advantage of quad-core... also, some of the lower-end quad-core chips were simply not clocked as fast as the higher-end dual-core chips, so you could get much better performance in some software, like games, with the last-generation optimized Core 2 Duo. However, the newer evolution from dual-core smartphone chips to quad-core smartphone chips may not be directly comparable to those Intel chips of the past, so you are right that one would expect the newer quad-core smartphone chips to be faster.

Still, this isn't true in all instances, since if a particular app is only coded to take advantage of a dual-core chip and not a quad-core chip, then a higher-clocked dual-core chip will outperform the quad-core with that piece of software. Extra cores don't always translate into extra speed if they end up being wasted. Regardless, it will be interesting to see what chips both companies come up with next to "trump" the competition. :)
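By the way, to put toy numbers on that "wasted cores" point, here's a Python sketch (the clock speeds are hypothetical): if an app only spawns two worker threads, a quad-core can use at most two of its cores, so per-core clock decides the race.

[code]
# Hypothetical clocks -- a two-thread app leaves half of a quad-core idle.

def effective_throughput(app_threads, cores, clock_ghz):
    """Work per second when an app can't use more cores than its threads."""
    return min(app_threads, cores) * clock_ghz

on_duo = effective_throughput(app_threads=2, cores=2, clock_ghz=3.0)
on_quad = effective_throughput(app_threads=2, cores=4, clock_ghz=2.4)
print(on_duo, on_quad)  # 6.0 vs 4.8 -- the higher-clocked dual-core wins
[/code]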
 

Agreed, it's only getting better. Look at it this way: Apple blows Tegra3's GPU out of the water. But Tegra3 blows the A5X's CPU out of the water in the Geekbench benchmark (which LaptopMag says measures "raw processing power rather than graphics"):

[Chart: Geekbench score results]

I mean it's not even close; Tegra3 wins hands-down. So what does that say? It says both of these companies know what the heck they're doing, and each one is going to try to trump the other where they're currently lacking. The winners? Us. haha
 