Galaxy S IV, New Mali GPU. - AT&T Samsung Galaxy S II SGH-I777

After being embarrassed by the iPhone's PowerVR SGX543 GPU, I hope Samsung regains the performance crown, and we may see that with the new Mali GPUs slated for the Galaxy S line.
What's probably going to be used in the Galaxy S III?
The Mali T604
The Mali-T604 was announced last year and it's the first implementation of ARM's new Midgard architecture. The T604 appears to be ARM's first unified shader architecture. Each T604 core is a combination of two arithmetic pipes and one texture pipe, although the width and capabilities of each are unknown. Like the Mali-400, the Mali-T604 will be available in 1-4 core configurations. The first T604 based SoCs will be available in the second half of 2012 on 28/32nm silicon. ARM is promising up to 68 GFLOPS of compute from T604 (presumably that's for a 4-core configuration at high clocks).
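For a quick sanity check on that 68 GFLOPS claim, peak compute is just cores x FLOPs per cycle x clock. Here's a small Python sketch; the 34 FLOPs/cycle/core and 500 MHz values are my own guesses, picked so a 4-core part lands on ARM's quoted number, and not published specs:

```python
# Rough peak-GFLOPS estimate: cores x FLOPs/cycle/core x clock.
# The 34 FLOPs/cycle/core and 500 MHz values are guesses chosen so a
# 4-core part lands near ARM's quoted "up to 68 GFLOPS" - not official specs.

def peak_gflops(cores, flops_per_cycle_per_core, clock_mhz):
    """Theoretical peak compute in GFLOPS."""
    return cores * flops_per_cycle_per_core * clock_mhz / 1000.0

for cores in (1, 2, 4):
    print(cores, "core(s):", peak_gflops(cores, 34, 500), "GFLOPS")
# 4 cores at these assumed numbers -> 68.0 GFLOPS
```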
But that's not the juicy part... what I'm excited for is the GALAXY S IV!!! (If we're all still alive by that time)
It boasts the MALI-T658 GPU!
The T658 is a second generation Midgard implementation with twice the arithmetic pipes per core compared to the T604. ARM also enables up to 8-core configurations with T658. We'll see the first T658 implementations on 28/32nm sometime in 2013. It's unclear what other architectural changes have been made compared to the T604, but at bare minimum we can hope for a doubling of execution resources. ARM is promising up to a 10x increase in performance compared to "mainstream" Mali-400 implementations (perhaps single-core Mali-400).
Did you see that? 8 freaking cores!
One can just imagine what tech we have by then.
Probably the following:
Super AMOLED Plus HD
1.5 GHz quad-core CPU
Octa-core GPU
My God.

Only 28nm? TSMC has had successful batches at 22nm.

Guess we're skipping III.

Related

[Q] why i see ARMV7 and not ARM Cortex-A9?

The CPU in the Atrix is a dual-core ARM Cortex-A9, so why do I see ARMv7?
Is it because Android 2.2 doesn't support dual core?
Will the Atrix only use the dual core on Android 2.3?
http://en.wikipedia.org/wiki/ARM_Cortex-A9_MPCore
Cortex-A9 uses ARMv7: ARMv7 is the instruction set architecture, and Cortex-A9 is a specific core that implements it.
I'm not sure Android 2.2 itself uses both cores, but apps written for dual core can still use both cores even on 2.2. (AFAIK)
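To make the ARMv7 vs Cortex-A9 distinction concrete: /proc/cpuinfo reports the architecture as a string ("ARMv7 Processor") and the actual core model as a CPU part number. A small Python sketch decoding the part numbers (these hex values are ARM's published MIDR part numbers):

```python
# The "ARMv7" you see is the architecture; the core model is encoded in
# the "CPU part" field of /proc/cpuinfo. These part numbers are ARM's
# published MIDR values for each core.
CPU_PARTS = {
    0xc08: "Cortex-A8",
    0xc09: "Cortex-A9",
    0xc0f: "Cortex-A15",
}

def core_name(cpu_part_hex):
    """Map a 'CPU part' string from /proc/cpuinfo to a core name."""
    return CPU_PARTS.get(int(cpu_part_hex, 16), "unknown")

print(core_name("0xc09"))  # an Atrix reports part 0xc09 -> Cortex-A9
```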
wtf???
Nvidia has already developed the other processors?
Tegra (Kal-El) series
Processor: Quad-core ARM Cortex-A9 MPCore, up to 1.5 GHz
Ultra Low Power CPU mode
40 nm by TSMC
Video output up to 2560x1600
NEON instruction sets from ARM
1080p H.264 High Profile video decode
12-core Nvidia GPU with support for 3D stereo (12 unified shaders); about 5 times faster than Tegra 2
To be released in the second half of 2011
Tegra (Wayne) series
Processor: Quad-core ARM Cortex-A15 MPCore ?
28 nm by TSMC ?
Improved GPU core: about 10 times faster than Tegra 2
To be released in 2012
Tegra (Logan) series
Processor: ARM ?
Improved GPU core: about 50 times faster than Tegra 2
To be released in 2013
Tegra (Stark) series
Processor: ARM ?
Improved GPU core: about 100 times faster than Tegra 2
To be released in 2014
That's ****ing nuts... Nvidia just sold Tegra 2, but from what I see they have already developed other processors. Nvidia doesn't release them because they want to make more money with a controlled evolution... :S That sucks. I want the latest ARM power from Tegra :'(
Every <THING> company that wants to survive is always working on the next-gen <THING> technology that is more advanced than their current <THING> technology.
Oh noes, Toyota already has a new Prius for next year but won't sell it yet. And Apple already has a new MacBook but won't sell it yet. OMG, Coach has a new style of bag and they are not selling it yet.
Every company does this. Every company wants you to buy things now and then buy something else later. That is how companies make money.
Sent from my MB860 using Tapatalk
It's programmed obsolescence.
Sent from my MB860 using XDA App

Evo 3D's asynchronous dual core?

I was just thinking about something. Is it really a fair comparison between an asynchronous dual core and a conventional dual core such as the Tegra or the OMAP4? We all know how everyone loves to compare benchmarks on phones. Also, we all know that the 3D does horribly on Quadrant scores. Is this because of the type of CPU we have? If it is... is it really fair to even try to compare them?
My thinking is that, if both of our cores ran at the same speed all of the time, our CPU would dominate everything on benchmarks. Am I wrong in thinking that? Is there any way we would truly know?
Ps. Hope this isn't dumb thinking. If it is, please just state why and move on. I am NOT trying to start any flame war or troll thread. This is a 100% completely sincere question.
Thanks in advance!
Sent by my supercharged dual core from the 3rd dimension.
Benchmark scores mean **** anyways. I don't know why people insist on using them. If the phone runs well, it runs well
A tad slower, mostly because it's based on a design similar to the ARM Cortex-A8. The others, like the Galaxy S II's SoC, are based on the newer Cortex-A9 design. This has been analyzed several times on AnandTech and other sites. Besides, those benchmarks aren't dual-core at all, so we're comparing apples to apples. The difference is in the designs: if you compare two CPUs clocked at the same speed (Snapdragon/A8 vs A9), the A9 will come out ahead.
Sent from my PG86100 using XDA App
I understand that benchmarks don't mean anything. I just want to know if the fact that our CPU is asynchronous has anything to do with the exceptionally low scores compared to other devices.
Sent from my PG86100 using XDA App
I'd chalk it up to the fact that the most recent OMAP and Exynos are based on the A9, while our Scorpion cores are heavily modified A8 designs by Qualcomm.
Ours are in between A8 and A9.
Sent from my PG86100 using xda premium
I roughly understand the difference between A9 and A8 based chips, but I personally think the current Snapdragon in the Shooter (MSM8660?) is a much superior chip to the Tegra 2. I got tired of my OG Evo, so I bought the Shooter off contract from a buddy for cheap, and I plan to get the Nexus Prime, which I believe will land at Sprint before January (contract up). The rumors are that it will use an OMAP 4660 clocked at 1.5 GHz. Just rumors, I know. But how will that compare to the Snapdragon in terms of speed and battery?
Sent from my PG86100 using Tapatalk
ROM Synergy 318, OC'd to 1.8 GHz (2.3.3 base), literally SMOKED the SGS2; it was hitting 4000+ in Quadrant Advanced. But yeah, scores mean nothing. We should have OC again soon and get nice shiny scores again.
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to any performance gains. The companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and then an AMD Athlon X2, but they are both based on the x86 architecture. There's a lot of confusion regarding the whole A8/A9 terminology, so honestly, I don't think it matters much what ARM revision our SoC in the Evo 3D is using.
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Hard to really say which processor is more powerful, but at this stage in smartphones all the dual cores seem to be powerful enough that it doesn't matter. Asynchronous vs. the other guys may be a different story though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
So to elaborate on what you want, I guess: asynchronous cores have nothing to do with the benchmarks, because these benchmarks only run on one core anyway (I'm pretty sure).
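To illustrate what "asynchronous" means here: each core picks its own frequency from its own load, instead of every core running at whatever the busiest core needs. A toy Python sketch; the frequency table and governor thresholds are invented for illustration, not the real MSM8660 values:

```python
# Toy model of asynchronous (per-core) frequency scaling: each core
# chooses its own clock from its own load, unlike a synchronous design
# where every core runs at the clock the busiest core needs.
# The frequency steps and thresholds below are invented for illustration.

FREQS_MHZ = [384, 768, 1188]  # hypothetical available frequency steps

def pick_freq(load_pct):
    """Pick the lowest frequency step that covers the core's load."""
    if load_pct < 30:
        return FREQS_MHZ[0]
    if load_pct < 70:
        return FREQS_MHZ[1]
    return FREQS_MHZ[2]

loads = [95, 10]  # core 0 busy, core 1 nearly idle
async_clocks = [pick_freq(l) for l in loads]  # each core on its own clock
sync_clock = pick_freq(max(loads))            # one shared clock for all cores
print("asynchronous:", async_clocks, "synchronous:", [sync_clock] * len(loads))
```

The battery-life argument in the post is exactly the gap between those two printed lists: the idle core gets to sit at a low clock instead of being dragged up with the busy one.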
sprinttouch666 said:
Hard to really say which processor is more powerful, but at this stage in smartphones all the dual cores seem to be powerful enough that it doesn't matter. Asynchronous vs. the other guys may be a different story though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
Ok. Now, what about performance wise? Will we be at an advantage or disadvantage?
lyon21 said:
Ok. Now, what about performance wise? Will we be at an advantage or disadvantage?
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is.
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
lyon21 said:
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Depends... If you are benchmarking with a non-multithreaded app like Quadrant, it doesn't matter, as you're running on a single core on both, and the A9 will be faster. And if you're running a multithreaded benchmark that fully uses both cores, then the "asynchronous" thing goes out of play, as you're using both cores on both devices.
Sent from my PG86100 using XDA App
il Duce said:
ROM Synergy 318, OC'd to 1.8 GHz (2.3.3 base), literally SMOKED the SGS2; it was hitting 4000+ in Quadrant Advanced. But yeah, scores mean nothing. We should have OC again soon and get nice shiny scores again.
Well, then if you overclock an A9 to 1.8 GHz you're back to square one and the A9 is still faster. I think Qualcomm has already announced their roadmap and an A9 killer is on its way. I think it's a quad core with Adreno 3xx (they'll also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race)
Sent from my PG86100 using XDA App
sn0b0ard said:
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to any performance gains. The companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and then an AMD Athlon X2, but they are both based on the x86 architecture. There's a lot of confusion regarding the whole A8/A9 terminology, so honestly, I don't think it matters much what ARM revision our SoC in the Evo 3D is using.
Yes, it matters. As in your comparison, each chip has new sets of instructions, pipelines and optimizations. Like the other guy said, our Snapdragons are between an A8 and an A9, and clock for clock the A9 is simply faster. Ours is an older architecture. By no means a slouch, but that's the truth.
Sent from my PG86100 using XDA App
jamexman said:
Yes, it matters. As in your comparison, each chip has new sets of instructions, pipelines and optimizations. Like the other guy said, our Snapdragons are between an A8 and an A9, and clock for clock the A9 is simply faster. Ours is an older architecture. By no means a slouch, but that's the truth.
Sent from my PG86100 using XDA App
See, here's the thing. Qualcomm doesn't just use stock ARM architecture. They licensed the technology and made their own Snapdragon chipset. Is the Snapdragon chipset family old? Yes, it has been around for a while. Is the chipset in the Evo 3D old? Not really. It was developed by Qualcomm relatively recently and expands on their existing, proven QSD chipset. This is like comparing apples to oranges; they are just two different SoCs. If you were to take an absolutely stock Cortex-A9 and put it against an absolutely stock Cortex-A8, then yes, the A9 obviously is going to win, but these companies try to market that their CPUs are one version higher than others, when in all reality they modify the hell out of the ARM architecture to make their chipsets.
sn0b0ard said:
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is.
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
Totally off topic, sorrrry!!!
Just followed the link above and WOW!! How can we con Qualcomm into giving us a copy of that home launcher they use, with the live wallpaper as well... HMMMMM
jamexman said:
Well, then if you overclock an A9 to 1.8 GHz you're back to square one and the A9 is still faster. I think Qualcomm has already announced their roadmap and an A9 killer is on its way. I think it's a quad core with Adreno 3xx (they'll also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race)
Sent from my PG86100 using XDA App
It is much harder to push an A9-based SoC to 1.8 GHz compared to the A8-based MSM8660. Clock for clock, the A9 will be faster. The A9 has greater IPC and a shorter pipeline, but this also prevents the A9 from running at frequencies as high as an A8-based SoC. How many 1.8 GHz Exynos chips do you see? In some regards the MSM8660 clearly beats some A9-based SoCs like the Tegra 2, which even lacks hardware support for NEON instructions. Snapdragons have also traditionally had high floating-point performance.
Also, there is no competition between Qualcomm and ARM. Qualcomm simply licenses designs from ARM and then customizes them for its own needs.
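The clock-for-clock point can be put in rough numbers with performance = IPC x clock. The relative IPC values below are illustrative assumptions (A9 doing about 25% more work per clock than Scorpion), not measured figures:

```python
# Back-of-the-envelope "performance = IPC x clock" comparison.
# The relative IPC numbers are illustrative assumptions (A9 assumed to do
# ~25% more work per clock than an A8-class Scorpion), not measured data.

def relative_perf(ipc, clock_ghz):
    """Relative throughput: work per clock times clocks per second."""
    return ipc * clock_ghz

scorpion = relative_perf(1.0, 1.8)   # A8-class core pushed to 1.8 GHz
a9 = relative_perf(1.25, 1.2)        # A9-class core at a typical 1.2 GHz
print("Scorpion @ 1.8 GHz:", scorpion, " A9 @ 1.2 GHz:", a9)
# The longer A8-class pipeline gives up IPC but buys clock headroom,
# which is exactly the trade-off described in the post above.
```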

Lenovo 'Transformer' IdeaTab S2

So will you guys be swapping your Asus Transformer Prime for a similar product? I'm sure most people are purchasing this due to the extra keyboard dock or Tegra 3.
EDIT: Personally I'll be sticking with the Asus Prime for now; it's a good device.
Specification:
10.1" IPS display
Qualcomm Snapdragon 8960 (28nm TSMC) dual-core 1.7GHz / Adreno 225 GPU at 400MHz (overclocked Adreno 220 + better drivers)
20-hour battery life
Keyboard dock like the Asus Transformer
16/32/64GB
The GPU is just on par with the Mali-400MP, which is a shame (GLBenchmark), but that is an early benchmark.
Overclocking should be a lot better for the CPU; since it's 28nm, I guess reaching over 2.0GHz is fine!
Information:
Lenovo Idea Tab S 2
We need to start the review by mentioning that there may be certain ambiguities in the specification listed here for the Lenovo IdeaTab S2, since it's actually not the official release. But as prior experience suggests, this information is normally bound to be true, so let us proceed. The Lenovo IdeaTab S2 is to have a 10.1 inch IPS display with a resolution of 1280 x 800 pixels, which would be a state-of-the-art screen panel and resolution. It will have a 1.5GHz Qualcomm Snapdragon 8960 dual-core processor with 1GB of RAM. This beast of hardware is controlled by Android OS v4.0 Ice Cream Sandwich, and Lenovo has included a completely modified UI called Mondrain UI for their IdeaTab.
It comes in three storage configurations, 16 / 32 / 64 GB, with the ability to expand the storage using a microSD card. It features a 5MP rear camera with autofocus and geotagging with Assisted GPS, and while the camera isn't that good, it has decent performance verifiers. The IdeaTab S2 will come with 3G connectivity, not 4G, which certainly is a surprise, and it also has Wi-Fi 802.11 b/g/n for continuous connectivity. They claim this tablet can control a smart TV, so we assume they have some variation of DLNA included in the IdeaTab S2 as well. Following in the footsteps of Asus, the Lenovo IdeaTab S2 also comes with a keyboard dock that adds battery life as well as additional ports and an optical track pad. It's such a good concept to replicate from Asus, and we reckon it could be a deal changer for the Lenovo IdeaTab S2.
Lenovo has also made their new tablet rather thin, scoring a mere 8.69mm in thickness and 580g in weight, which is surprisingly light. The built-in battery can last up to 9 hours per Lenovo, and if you hook it up with the keyboard dock, 20 hours of total battery life is promised by Lenovo, which is a very good move.
Video: http://www.youtube.com/watch?v=vWAOmO4LUIo
I certainly won't be going through the trouble of changing to this. This doesn't really look to add anything of value for me (I don't need GPS and my Wi-Fi works fine), and if Lenovo's past pricing holds true, this will likely be more expensive than the equivalent Primes.
MrPhilo said:
The GPU is just on par with the Mali-400MP, which is a shame (GLBenchmark), but that is an early benchmark.
That's surprising because of the GFLOPS specs for the GPUs:
Tegra 3 Kal-El: 7.2 GFLOPS
Qualcomm 8960 Adreno 225: 19.2 GFLOPS
PowerVR SGX543MP2: 19.2 GFLOPS
And per Anandtech "Qualcomm claims that MSM8960 will be able to outperform Apple's A5 in GLBenchmark 2.x at qHD resolutions." Of course, Qualcomm would say that but even if it is on par with the iPad2 (543MP2) it will still significantly outperform the Tegra3.
L3rry said:
That's surprising because of the GFLOPS specs for the GPUs:
Tegra 3 Kal-El: 7.2 GFLOPS
Qualcomm 8960 Adreno 225: 19.2 GFLOPS
PowerVR SGX543MP2: 19.2 GFLOPS
And per Anandtech "Qualcomm claims that MSM8960 will be able to outperform Apple's A5 in GLBenchmark 2.x at qHD resolutions." Of course, Qualcomm would say that but even if it is on par with the iPad2 (543MP2) it will still significantly outperform the Tegra3.
Yes, but the driver is the most important part. Since Tegra 3 Kal-El is clocked higher than 300MHz, the 7.2 GFLOPS figure doesn't tell the whole story.
I doubt it'll significantly outperform the Tegra 3 GPU. Just like the Adreno 220 was meant to be better but isn't much different.
Even Qualcomm admitted that it'll only have 50% more performance than its current Adreno 220.
FML, GLBenchmark took down the Asus TF202 GPU results. It performed lower than the Mali GPU; wish I had saved the website.
With Adreno 225 Qualcomm improves performance along two vectors, the first being clock speed. While Adreno 220 (used in the MSM8660) ran at 266MHz, Adreno 225 runs at 400MHz thanks to 28nm. Secondly, Qualcomm tells us Adreno 225 is accompanied by "significant driver improvements". Keeping in mind the sheer amount of compute potential of the Adreno 22x family, it only makes sense that driver improvements could unlock a lot of performance. Qualcomm expects the 225 to be 50% faster than the outgoing 220
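One thing worth noting about those GFLOPS figures: numbers quoted at a reference clock scale linearly with the actual clock, so a 19.2 GFLOPS @ 300MHz part running at 400MHz looks quite different. A one-liner to make that concrete, using the 19.2 GFLOPS baseline quoted earlier in the thread:

```python
# Peak-FLOPS figures quoted at a reference clock scale linearly with the
# actual clock, so an "@300MHz" number understates a 400MHz part.
# The 19.2 GFLOPS @ 300MHz baseline is the figure quoted in this thread.

def scale_gflops(gflops_at_ref, ref_mhz, actual_mhz):
    """Scale a peak-GFLOPS figure from a reference clock to the real clock."""
    return gflops_at_ref * actual_mhz / ref_mhz

print(scale_gflops(19.2, 300, 400))  # Adreno 225 at 400MHz -> 25.6
```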
MrPhilo said:
FML, GLBenchmark took down the Asus TF202 GPU results. It performed lower than the Mali GPU; wish I had saved the website.
Yes, I saw that comment posted in another tread and I tried to google it but could not find it. Hopefully, Anandtech will put something out soon once demos for these newer tablets are available.
I've personally had a lot of headaches in the past with Lenovo laptops so I doubt I'll be making another Lenovo purchase. (Google "Y530 Lenovo Hinges" if you're interested in the issue- it was a common problem due to faulty design.)
The PowerVR and Adreno have much more efficient rendering methods than the Tegra chips, so this tablet is no pushover at all.
I wouldn't be surprised if real-world performance is better than the Tegra 3 outside of Tegra 3 specific apps.
hey
The Adreno 225 and SGX 543MP2 both get 19.2 GFLOPS @ 300MHz. We don't know the clock speed of the A5's GPU, but we can speculate that it's probably around the 250-300MHz range.
That makes the Adreno (@ 400MHz) more powerful in FLOPS than even the A5 or Tegra 3. However, FLOPS don't tell the whole story: the A5 has twice the number of TMUs, so it has a higher fill rate clock for clock and better texturing capability.
The A5 will likely have more ROPs as well, but I don't know that.
The A5 will also have slightly higher bandwidth, I think.
Looking at what Anand has said, the Adreno 220 only had single-channel memory, i.e. low bandwidth, and probably poor efficiency in getting data to the shaders; I think PowerVR is more efficient than the Adreno 2xx series.
The drivers on Adreno were not very good either. Indeed, some developers on this forum have managed to DOUBLE the Adreno performance using the newest Adreno drivers from Qualcomm; I think shaky153 was leading the charge.
I would be very surprised if the Adreno 225 equaled the A5, but it might equal or slightly beat the Tegra 3, especially at higher resolutions due to Tegra's lack of bandwidth.
I don't understand why Nvidia doesn't announce the GPU clock speed!! They detailed it with Tegra 2, which means there is something to hide.
The AP25 was 400MHz, so T3 shouldn't be under 400MHz.
This discussion would be a lot easier if we knew the actual clock speed.
Prime/Nvidia rules!
Plus, Lenovo has had no development support at all, and they are one of the slowest to release firmware updates. Everything is basically dead in Lenovo land.
It seems OK, but nothing enticing enough to make me think twice about trading my Prime. The Prime is just too cool all around.

iPad 4 vs 5250 (Nexus 10 SoC) GLBenchmark full results. UPDATE: now with Anandtech!!

XXXUPDATEXXX
Anandtech has now published its performance preview of the Nexus 10, so let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result for the iPad 4 has appeared on GLBenchmark, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so will be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, which is an important criterion for gaming.
Which device wins? Click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is nearly twice as fast as its older brother; the Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS, though Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500MHz vs 250MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output etc. are not double the iPad 3's, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
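Just to put the "twice as fast" claim in exact numbers, here's the arithmetic on the Egypt HD offscreen results quoted above:

```python
# Speed-ups computed from the Egypt HD 1080p offscreen numbers above.
ipad4, ipad3, exynos5250 = 48.6, 25.9, 33.7

print("iPad 4 vs iPad 3: %.2fx" % (ipad4 / ipad3))       # ~1.88x
print("iPad 4 vs 5250:   %.2fx" % (ipad4 / exynos5250))  # ~1.44x
```

So "twice as fast" is really 1.88x, which is part of why the doubled GPU clock alone doesn't fully explain the jump.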
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports are overwhelmingly positive on the Nexus 10.
So the Mali-T604 didn't manage 5 times the performance of the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after firmware updates it's way better than other SoCs in Android handsets.
hung2900 said:
So the Mali-T604 didn't manage 5 times the performance of the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as what's expected for the N10. Samsung has a habit of optimizing drivers in later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than the Tegra 2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you again check the numbers now for SGS2, it's again another 50% improvement in performance from the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz, so it shouldn't be this much lower.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
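For reference, the theoretical fill rate falls out of pixels-per-clock times clock. Assuming 1 pixel/clock per core and 4 shader cores for the T604 (my reading of ARM's blog post, not a confirmed spec), the quoted figures line up:

```python
# Theoretical fill rate = cores x pixels written per clock x clock.
# Assumes 1 pixel/clock per core and a 4-core Mali-T604; these are my
# assumptions from ARM's blog post, not confirmed hardware specs.

def fill_rate_gpix(cores, pixels_per_clock_per_core, clock_mhz):
    """Peak fill rate in GPixels/s."""
    return cores * pixels_per_clock_per_core * clock_mhz / 1000.0

print(fill_rate_gpix(4, 1, 500))  # 2.0 GPixels/s, ARM's quoted figure
print(fill_rate_gpix(4, 1, 533))  # ~2.13 GPixels/s at Samsung's 533MHz
```

With the Arndale board showing only about 60% of that, the gap really does look like drivers rather than hardware.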
hung2900 said:
So the Mali-T604 didn't manage 5 times the performance of the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after firmware updates it's way better than other SoCs in Android handsets.
In areas where the Mali-400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance. Likewise, in these low-level tests the iPad 4 is not a solid 2x the power of the iPad 3, yet it achieves twice the FPS of its older brother in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as what's expected for the N10. Samsung has a habit of optimizing drivers in later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than the Tegra 2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you again check the numbers now for SGS2, it's again another 50% improvement in performance from the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz, so it shouldn't be this much lower.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
I agree with most of what you have said. On the GPixel figure: this is like ATI GPUs' teraflops figures always being much higher than Nvidia's. In theory, with code written to hit the device perfectly, you might see those high figures, but in reality the Nvidia cards with lower on-paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions; the amount of resources needed to drive those high-MP displays means lots of compromises in terms of effects, polygon complexity etc. to ensure decent FPS. Especially when you consider that driving Battlefield 3 at 2560x1600 with AA and high textures requires a PC that burns 400+ watts of power, not a 10-watt SoC.
Overall, considering the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both. The biggest factor will be the level of support game developers provide for each device; the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
What's the point of speculation? Just wait for the device to be released and run all the tests you want to get confirmation on performance. It doesn't hurt to wait.
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Both A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if new SoCs are too powerful and max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0GHz quad-core variant in the semi-near future... Ohhhh, the possibilities. Samsung has one hell of a piece of silicon on their hands.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then make it fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google if you want to use Chrome as the stock browser, then develop to fast and smooth and not an insult, stock AOSP browser would be so much faster.
Click to expand...
Click to collapse
True, Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would get much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimizations aren't working.
The GLBenchmark results are weird. The Optimus G posts much better results than the N4 when both have the same hardware; in fact the N4 scores lower than the Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about fill rate. Need to check what you guys say about this.
Is it running some debugging code on the devices at this time?
Turbotab said:
Both A9 & A15 use the same instruction set architecture (ISA) so no they won't. Benchmarks may need to be modified, if the new SoC are too powerful and max out the old benches, but for GL Benchmark, that has not happened yet and there are already new updates in the pipeline.
Click to expand...
Click to collapse
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali T604 at 750 MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Click to expand...
Click to collapse
I have to disagree, this is from ARM's info site.
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
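As a quick aside, this is also why tools report "ARMv7" rather than "Cortex-A9": the kernel exposes the ISA, not the core's marketing name. Here's a rough Python sketch that pulls the relevant fields out of `/proc/cpuinfo` text; field names vary between kernels, and the sample below is illustrative rather than captured from any particular device.

```python
def parse_isa(cpuinfo_text):
    """Extract architecture/ISA hints from /proc/cpuinfo text.

    Field names differ between kernels, so several common keys are checked.
    """
    isa = {}
    for line in cpuinfo_text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        isa[key.strip()] = value.strip()
    return isa

# Illustrative sample in the shape /proc/cpuinfo takes on an ARMv7-A
# (Cortex-A9 class) device; used as fixed data instead of a live read.
SAMPLE = """\
Processor\t: ARMv7 Processor rev 10 (v7l)
CPU architecture: 7
Features\t: swp half thumb fastmult vfp edsp neon vfpv3
"""

info = parse_isa(SAMPLE)
print(info["CPU architecture"])  # 7 -- a Cortex-A15 reports the same
```

Since the A9 and A15 both implement ARMv7-A, this output cannot distinguish them, which is exactly why existing benchmarks keep working.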
Keion said:
Once we get rid of the underclock no tablet will be able to match. I'm sure the Mali t604 at 750 MHz would destroy everything.
Click to expand...
Click to collapse
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that awesome resolution does tax the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that Awesome resolution does tax the GPU a lot. Heck most lower end desktop GPUs would struggle
Click to expand...
Click to collapse
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Click to expand...
Click to collapse
Nah, I think we can beat that too.
Drivers + OC.

How does the S4 pro compare to the Exynos 5??

Planning to return my N10 because the stuttering is driving me insane, and I'm really interested in the Tablet Z at the moment.
It's 1920x1080 on a 1.5 GHz Qualcomm APQ8064 with an Adreno 320 GPU vs. 2560x1600 on the Nexus 10 with a 1.7 GHz Exynos 5 and a Mali T604 GPU.
Fasty12 said:
Planning to return my N10 cause the stuttering on it is driving me insane and im really interested in the Tablet Z currently.
1920x1080 on an 1.5GHz Qualcomm APQ8064 with adreno 320GPU VS 2560x1600 nexus 10 with an exynos 5 and a mali t604 GPU clocked at 1.7 GHZ.
Click to expand...
Click to collapse
The S4 is halfway between the Cortex-A9 cores and the new Cortex-A15 cores that we have, so it is a decent enough CPU. I am not sure how good the GPU is; none of my devices in the past couple of years have had Adreno GPUs. At least it won't have to work as hard with the lower resolution.
Fasty12 said:
Planning to return my N10 cause the stuttering on it is driving me insane and im really interested in the Tablet Z currently.
1920x1080 on an 1.5GHz Qualcomm APQ8064 with adreno 320GPU VS 2560x1600 nexus 10 with an exynos 5 and a mali t604 GPU clocked at 1.7 GHZ.
Click to expand...
Click to collapse
What stuttering are you talking about?
Draw your own conclusions.
S4 Pro - http://www.anandtech.com/show/6112/...agon-s4-apq8064adreno-320-performance-preview
Exynos 5 - http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual
From everything I've seen and experienced, the Exynos 5 is the better of the two. The A15 is a more powerful core than the Krait core; that, along with the higher clock speed and the better GPU, makes for a better chip. Personally I have never had my N10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
enik_fox said:
From everything I've seen and experienced the exynos 5 is the better of the two. The a15 is a more powerful core than the krait core, that with the higher clock speeds and the better GPU makes for a better chip. Personally I have never had my n10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
Click to expand...
Click to collapse
But the Exynos 5 has to run that massive screen res. Also, the reason I think that Qualcomm modified the core was because of the power consumption. Stock A-15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Fasty12 said:
Planning to return my N10 cause the stuttering on it is driving me insane and im really interested in the Tablet Z currently.
1920x1080 on an 1.5GHz Qualcomm APQ8064 with adreno 320GPU VS 2560x1600 nexus 10 with an exynos 5 and a mali t604 GPU clocked at 1.7 GHZ.
Click to expand...
Click to collapse
Every now and then I read ppl complaining about lags and stutters... I have not experienced one since I have the device; can you please explain what you are doing when this happens?
avdaga said:
Every now and then I read ppl complaining about lags and stutters... I have not experienced one since I have the device; can you please explain what you are doing when this happens?
Click to expand...
Click to collapse
Try opening and closing Google Maps after the map has been loaded; there is a NOTICEABLE frame rate drop compared to other apps.
kaspar737 said:
But the Exynos 5 has to run that massive screen res. Also, the reason I think that Qualcomm modified the core was because of the power consumption. Stock A-15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Click to expand...
Click to collapse
The Snapdragon S4 does not use an A15 core or any derivative of an A15. Qualcomm has ALWAYS designed their cores completely custom, with almost nothing taken from the current major core in ARM's reference design. Additionally, the S4 was designed and released before the A15 MPCore had even finished its design phase.
The Krait core uses a similar (but not the same) triple-wide decode stage like the A15 core, but it uses a completely different 11-stage execution pipeline compared to the A15's 15-stage pipeline. The extra stages let the A15 design break work into smaller steps and achieve higher frequency, but on a misprediction the A15 must wait longer before it can start over, whereas the Krait core doesn't have to wait as long, though it also isn't as efficient in "normal" circumstances. Honestly, the integer performance between the two cores is pretty close, but I think I remember seeing that the A15 has much stronger floating-point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player who does not base their processors on the ARM design.
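The pipeline-depth trade-off described above can be sketched with a toy CPI (cycles per instruction) model. The 11 vs. 15 stage counts come from the post; the branch frequency and mispredict rate below are made-up illustrative numbers, not measured Krait or A15 figures.

```python
def effective_cpi(base_cpi, branch_freq, mispredict_rate, flush_penalty):
    """Average cycles per instruction once mispredict flushes are included.

    flush_penalty is roughly proportional to pipeline depth: a deeper
    pipeline clocks higher but throws away more work on each mispredict.
    """
    return base_cpi + branch_freq * mispredict_rate * flush_penalty

# Illustrative workload: 20% branches, 5% of them mispredicted.
BRANCH_FREQ = 0.20
MISS_RATE = 0.05

krait_cpi = effective_cpi(1.0, BRANCH_FREQ, MISS_RATE, flush_penalty=11)
a15_cpi = effective_cpi(1.0, BRANCH_FREQ, MISS_RATE, flush_penalty=15)

print(round(krait_cpi, 3))  # 1.11
print(round(a15_cpi, 3))    # 1.15
```

In practice the deeper pipeline also buys a higher clock, so the real comparison is cycles lost per mispredict versus frequency gained, which is why the outcome ends up workload-dependent as the post says.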
Fasty12 said:
Try opening and closing google maps after the map has been loaded there is a NOTICEABLE frame rate drop compare to other apps.
Click to expand...
Click to collapse
Do you mean a drop in framerate during the animation when closing Maps? I notice a minor framerate drop that lasts as long as the animation does, but if that is it, I'm kinda wondering why you bought an Android device in the first place... I have not noticed this before, and I cannot imagine anyone would while using the device for its intended purposes. If you take any Android device, you will find an fps drop at some point... Maybe return it and take an iPad? iPads do not have the issue; on the other hand, there's a lot that iPads do not have ^^
kaspar737 said:
But the Exynos 5 has to run that massive screen res. Also, the reason I think that Qualcomm modified the core was because of the power consumption. Stock A-15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Click to expand...
Click to collapse
Exynos has higher memory bandwidth so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
EniGmA1987 said:
The Snapdragon S4 does not use an A15 core or any derivative of an A15. Qualcomm has ALWAYS completely designed their cores custom and has almost nothing to do with the current major core from ARM's reference design. Additionally, the S4 was designed and released before the A15 MP-Core was even finished with its design phase.
The Krait core uses a similar (but not the same) triple wide decode stage like the A15 core, but it uses a completely different 11 stage execution pipeline compared to the A15's 15 stage pipeline. The higher stages of the pipeline allow the A15 design to break things down smaller and achieve higher frequency, but if there were to be a failure in computing then the A15 must wait a longer time before it can start over where the Krait core doesnt have to wait as long, but also isnt as efficient in " normal" circumstances. Honestly the integer performance between the two cores is pretty close, but I think I remember seeing that the A15 has a lot stronger floating point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player who does not base their processors on the ARM design
Click to expand...
Click to collapse
THANK YOU!! My god, I've had to explain this so many times! Qualcomm licenses ONLY the ARMv7 instruction set and not ARM's designs. They design their own cores and GPU from the ground up, so please, people, stop saying a Qualcomm chip is a Cortex-series processor, because it isn't. Samsung and the rest license ARM's design and modify it; in Samsung's case they tend to increase the IPC slightly and give it more memory bandwidth.
Also, to answer the question: the Exynos 5 will do better at higher resolutions, and they will be very close at lower resolutions. The S4 will more than likely be better in multi-threaded workloads, and the Exynos will have better float performance. The Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
Exynos has higher memory bandwidth so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
THANK YOU!! my god I've had to explain this so many times! Qualcomm licenses ONLY the armv7 instructions and not arms designs. They design their own chips from the ground up and GPU, so please people stop saying Qualcomm is a cortex series processor because it isn't. Samsung and the rest license arms design and modify it, in Samsungs case they tend to increase the IPC slightly and give it more memory bandwidth.
Also to answer the question, exynos 5 will do better at higher resolutions and they will be very close in lower resolutions. S4 will be better in multi thread workloads more then likely and exynos will have better float performance. Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
Click to expand...
Click to collapse
The 50 percent extra memory bandwidth doesn't matter so much considering that the Exynos has to move almost twice the number of pixels.
Sent from my LG-P990 using xda app-developers app
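The "almost twice the pixels" point checks out arithmetically. A quick sketch, assuming a 32-bit framebuffer at 60 fps and ignoring overdraw, compositing, and compression, so these are floor figures only:

```python
def framebuffer_bandwidth(width, height, fps=60, bytes_per_pixel=4):
    """Bytes/second needed just to write one full frame per refresh."""
    return width * height * bytes_per_pixel * fps

n10_pixels = 2560 * 1600   # Nexus 10 panel
s4_pixels = 1920 * 1080    # a 1080p S4 Pro device

print(round(n10_pixels / s4_pixels, 2))  # 1.98 -- "almost twice"

# Minimum write bandwidth for the display surface alone, in GB/s:
print(framebuffer_bandwidth(2560, 1600) / 1e9)  # ~0.98
print(framebuffer_bandwidth(1920, 1080) / 1e9)  # ~0.50
```

Even these floor numbers are a small slice of a 12.8 GB/s bus; the real pressure comes from texturing and overdraw, which scale with the same pixel counts.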
kaspar737 said:
The 50 percent extra memory bandwith doesn't matter so much considered that the Exynos has to move almost twice the amount of pixels.
Sent from my LG-P990 using xda app-developers app
Click to expand...
Click to collapse
Memory bandwidth makes a pretty big difference when it comes to resolution, e.g. the 8600GT with DDR2 vs. DDR3. A wider memory bus and faster memory make a big difference in the high-resolution performance of any GPU.
It will also help with GPU compute performance for future apps utilizing the Mali T604's compute abilities.
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
Memory bandwidth makes a pretty big difference when it comes too resolutions. I.E 8600gt ddr2 vs ddr3. Wider memory bus and faster memory makes a big difference in higher res performance of any GPU
Also will help with GPU compute performance for future apps utilizing the Mali t604s compute abilities
Sent from my Galaxy Nexus using Tapatalk 2
Click to expand...
Click to collapse
But let's say the Exynos uses the whole 12.8 GB/s of bandwidth. Moving half as many pixels would then need only 6.4 GB/s, so memory bandwidth isn't an issue.
Sent from my LG-P990 using xda app-developers app
kaspar737 said:
But lets say that Exynos uses the whole 12.8 gb/s bandwith. That means that to move twice as less pixels you would need 6.4 gb/s so memory bandwidth isn't an issue.
Sent from my LG-P990 using xda app-developers app
Click to expand...
Click to collapse
But that bandwidth is shared, unlike on a dedicated GPU, where it isn't. The higher total system bandwidth (not including buses for the modem or whatever else is there) on the Exynos chip is going to give it the edge in any situation, considering how close the two are in performance. It also can't be denied that the Mali T604 has an edge in horsepower over the Adreno 320, because even at the N10's resolution it comes within a couple of fps of the Adreno running at 1080p. Not saying it's a big difference, but the Exynos is the more powerful all-around chip, and that's just in its dual-core form.
Edit: It's also a known fact that Adreno has poor fill rate compared to Mali or PowerVR. Adreno's strength is geometry performance, so it takes more of a hit at higher resolutions than either the Mali T604 or the SGX 554MP4, which both have higher fill rates; the SoCs we are comparing them in also have higher bandwidth to feed that, so we don't get bottlenecked.
Sent from my Galaxy Nexus using Tapatalk 2
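The fill-rate pressure being described scales linearly with resolution. A rough sketch; the 2.5x overdraw factor below is an assumption for a typical UI/game scene, not a measured number for any of these GPUs:

```python
def required_fillrate(width, height, fps=60, overdraw=2.5):
    """Pixels/second the GPU must shade, assuming each screen pixel
    is touched `overdraw` times per frame on average."""
    return width * height * fps * overdraw

# Megapixels/second needed at each tablet's native resolution:
print(required_fillrate(2560, 1600) / 1e6)  # 614.4 Mpix/s for the N10
print(required_fillrate(1920, 1080) / 1e6)  # 311.04 Mpix/s at 1080p
```

So a fill-rate-limited GPU pays roughly double at the N10's resolution, which is why a geometry-strong, fill-rate-weak design suffers more there.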
Finally, a lot of experts here about GPUs. I know it is not related to the topic, but my girlfriend and I have a Galaxy Note 2 and an S3. As you all know, they are the last ones to use the older Mali-400 GPU. I love playing games and I am getting my girlfriend used to them too. So I was wondering: how is our Mali-400 GPU holding up against the incoming 1080p Adreno 320 devices? It is clear the future is 1080p. I am planning to switch our devices for a couple of Nexus 4s or Xperia Zs, because I fear our devices are about to be outdated with the next game right around the corner. So far they are doing just fine with Modern Combat 4 and all the other graphics-intensive games, playing at over 28-30 FPS. But according to GLBenchmark 2.5 Egypt they are useless against the new Adreno 320. However, I have read that most games were designed for high fill-rate power, where the Mali-400 is able to beat the Adreno 320; it just bottlenecks on the triangle tests.
So what is your opinion? Will our devices manage another year and a half of new games? Or should I make the trade? Or should I just buy a Nexus 10 with 2 users assigned and continue games on it?
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
_delice_doluca_ said:
Finally a lot of exerts here about GPU
, I know it is not related to topic but me and my girlfriend have Galaxy Note 2 and S3. As you all know they are the last ones to use the elder Mali-400 GPU. I love playing games and I am getting my girlfriend used to them too. So I was wondering how is our Mali400 GPU holding up against the new coming 1080p Adreno 320 devices? It is clear the future is 1080p. I am either planning to switch our devices with a couple of Nexus 4s or Xperia Zs. Because I fear our devices are about to be outdated with the next game right around the corner. So far they are doing just fine with Modern Combat 4 and the all other graphic intensive games by playing over 28-30 FPS. But according to the GLBenchmark 2.5Egypt they are useless against new Adreno 320. However I have read that most of the games were designed for high fill rate power and Mali 400 is able to beat Adreno 320. But on the triangle tests, it just bottlenecks.
So what is your opinion about it? I will our devices do another year and half for the new games? Or should I make the trade? Or should I just buy a Nexus 10 with 2 users assigned and continue games on it? I
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
Click to expand...
Click to collapse
They will hold on; my SGS2 runs all of the current games at the highest settings (I haven't tried GTA, though) without any issues. The Adreno 320 is far better than the Mali-400 MP4, though.
Yeah, I'm pretty sure they will still play games a year from now. Until the market is completely saturated with devices as powerful as the Nexus 10, we won't really see large jumps in system requirements. That will probably only happen a year or two from now, once all the new phones and tablets are made with A15 processors (or the Qualcomm equivalent) and beefy GPUs.
Fidelator said:
They will hold on, my SGS2 runs all of the current games at the highest settings ( I haven't tried GTA though) without any issues, the Adreno 320 is far better than the Mali 400 MP4 though
Click to expand...
Click to collapse
The S2(Mali400) plays GTA3 without a hiccup.
The Exynos dual is very power-hungry compared to the S4 Pro, but it is also the most powerful ARM processor out today; nothing else yet released (I said RELEASED) is as powerful or can match its bandwidth. Having said that, I'm sure a normal-resolution 1080p screen in this form factor with the S4 Pro would make a nice, fast tablet. Right now the Exynos dual is pretty much the only thing outside Apple that can push the resolution the N10 has. I think if they had put another gig of DDR3 in this thing there wouldn't be so much stuttering in certain instances. Besides the thermal cutoff, the N10 is starved for memory, as it has to share its RAM between normal duties and the graphical load of pushing all the pixels of this monster resolution. You are lucky to have 300 MB of RAM available at idle on the N10, versus over a gig available with the S4 Pro on the 720p screen of the Nexus 4.
Sent from my often-RMA'd Nexus 4, so that I can use the one I'm using now when I get the 6th and hopefully final one.
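The free-memory comparison above is easy to reproduce on-device. A sketch that parses `/proc/meminfo` text; it is shown against fixed sample data, so the numbers here are illustrative, not the poster's actual readings:

```python
def parse_meminfo(text):
    """Return /proc/meminfo fields as {name: kB} integers."""
    fields = {}
    for line in text.splitlines():
        name, _, rest = line.partition(":")
        parts = rest.split()
        if parts:
            fields[name.strip()] = int(parts[0])  # values are in kB
    return fields

# Illustrative sample in /proc/meminfo's format (values made up):
SAMPLE = """\
MemTotal:        1945636 kB
MemFree:          204812 kB
Cached:           312400 kB
"""

mem = parse_meminfo(SAMPLE)
# A common approximation of "available" memory on kernels that
# predate the MemAvailable field:
available_mb = (mem["MemFree"] + mem["Cached"]) // 1024
print(available_mb)  # 505
```

Running the same parse on a live N10 versus a Nexus 4 at idle is what produces the kind of 300 MB vs. 1 GB+ gap described above.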
