RLY?! Xperia x10 gets ISC port but not atrix? - Atrix 4G Q&A, Help & Troubleshooting

X10 is garbage! this is outrageous!

Yes, really, they got it working. If you want it so bad, try porting it yourself.
Sent from my MB860 using XDA App

Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.

dLo GSR said:
Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
Oh snap. That was awesome.
Sent from my MB860 using XDA App

I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App

firefox3 said:
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
Good news man
Sent from my MB860 using XDA App

Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.

Javi97100 said:
Good news man
Sent from my MB860 using XDA App
It's turning out to be harder than I thought... I think no one will get it until official updates come out for other phones.

Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
So EGL = GPU driver? If that's the only setback, would it be possible to get an ICS ROM with software rendering as a proof of concept, or are there other pieces missing?
GB/CM7 is pretty good on the Atrix, so if we don't see ICS for a few months it doesn't hurt us in any way. I'd like to think most of us can be patient if we lack the skills to help.
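For what it's worth, the software-rendering proof of concept asked about above is roughly what the existing ports already do: with no vendor EGL libraries present, SurfaceFlinger falls back to the software renderer. AOSP's EGL loader also honours a debug property that forces that fallback even when hardware libs exist; a hedged sketch of a prop-file fragment (the property name comes from the AOSP EGL loader sources; whether any given ICS port respects it is not guaranteed):

```
# /data/local.prop (or build.prop) - tell the Android EGL loader to skip
# hardware EGL libraries and fall back to the software renderer
debug.egl.hw=0
```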

I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it, I love my Atrix in its current state.

According to AnandTech, Tegra 2 support is essentially ready, so I think as long as Nvidia releases the source for ICS (libs?), someone will try to port it. Hell, I have a good 5 weeks during break, I might as well try then.
Sent from my MB860 using XDA App

Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.

edgeicator said:
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Buddy, check out any of the kernels available in the dev thread and you'll see that the GPUs are overclocked.

WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it, I love my Atrix in its current state.
Doubt the iPhone will see ICS; the newest model that can run Android, as far as I know, is the iPhone 3G, which was incredibly slow under Gingerbread.

mac208x said:
X10 is garbage! this is outrageous!
222 posts and zero thanks? Is this what you do, go around XDA posting useless threads, like the guy complaining about returning home early, despite nobody asking him, "to get MIUI ported on his grandma's phone"?
Are you guys related by any chance?

edgeicator said:
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU performance bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)

Azurael said:
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU performance bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @ 300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more of Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
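The GFLOPS figures quoted above are peak-throughput arithmetic and can be sanity-checked; a minimal sketch, assuming each shader ALU retires one multiply-add (2 FLOPs) per cycle, which is a simplification of the real pipelines (the 32-lane count below is hypothetical, chosen to reproduce the quoted 19.2 figure):

```python
def peak_gflops(alus, clock_ghz, flops_per_cycle=2):
    """Theoretical peak = ALUs x clock (GHz) x FLOPs per ALU per cycle."""
    return alus * clock_ghz * flops_per_cycle

# Tegra 3: 12 shader "cores" at the 300 MHz reference clock
print(round(peak_gflops(12, 0.3), 1))   # 7.2

# 19.2 GFLOPS @ 300 MHz corresponds to 32 MAD lanes under the same assumption
print(round(peak_gflops(32, 0.3), 1))   # 19.2
```

Peak numbers like these say nothing about efficiency or memory bandwidth, which is why benchmark results diverge from them so widely.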

edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @ 300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more of Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
Don't you get tired of writing those long rants? We understand you know something about CPU architecture, and that Tegra isn't the best one out there, but damn man, it's the same thing in every thread. Just chill out and try to stay on topic for once.
Sent from my MB860 using Tapatalk

edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @ 300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more of Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
I think you are not seeing the whole picture...
The Tegra 3 (et al.) is not just about its quad-core implementation; remember that the GPU will offer 12 cores, which will translate into performance not yet seen on any other platform.
Benchmarks don't tell the whole story! Especially those benchmarking tools which are not Tegra 3 optimized yet.
Cheers!
Sent from my Atrix using Tapatalk

WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it, I love my Atrix in its current state.
LOL, I ran all the iDroid ports on my iPhone. Not one was even in alpha stage; I would not even count iDroid as a port, since you can't use anything on it.


So what gives with these lousy benchmarks?

I finally found a comparable Tegra 2 bench posted online in a Droid X2 review; both devices have a qHD screen. It's looking like the hardware we have here isn't particularly impressive, and let's not even go there with the Galaxy S 2 *shudder*, it's a massacre.
I was given to understand that the Qualcomm/Adreno setup was going to at least be competitive, and was supposed to be all-out superior to Tegra 2. Can anyone shed some light on this?
Levito said:
I finally found a comparable Tegra 2 bench posted online in a Droid X2 review; both devices have a qHD screen. It's looking like the hardware we have here isn't particularly impressive, and let's not even go there with the Galaxy S 2 *shudder*, it's a massacre.
I was given to understand that the Qualcomm/Adreno setup was going to at least be competitive, and was supposed to be all-out superior to Tegra 2. Can anyone shed some light on this?
I don't look at benchmarks too much... but it can download and upload like a god, that's its power tool.
My overclocked 1.5 GHz Tegra 2 lags behind my EVO 3D, but it scores 900 more points in Quadrant, so my epeen feels alright. Seriously, most of these benchmarks are not coded well.
I think the 3VO uses only one core with Quadrant. You have to use a dual-core benchmark test like CF-Bench for better results. Then again, benchmarks really don't mean much.
Sent from my PG86100 using Tapatalk
Benchmarks are nearly useless measures.
Using benchmarks to determine real world performance is like licking your finger and sticking it up in the air to determine how fast the wind is moving.
Yeah, it'll put you roughly in the ballpark--roughly. But that "ballpark" is big enough to drive a couple dump trucks through...
Both the Droid X2 and the Galaxy S2 aren't running Sense, which usually drags down benchmarks even though the phone is silky smooth. Benchmarks may be useful for testing modifications on the same phone, but not for comparing different phones. Just ask yourself... Does it seem to suffer to you?
Sent from my PG86100 using XDA App
Who gives a #$% about benchmarks? All I know is that this thing is fast, way faster than the EVO. I have a gTablet (Tegra 2, Honeycomb) that runs games very well, and this 3VO runs the same games but smoother and faster, no hiccups at all. Totally happy here, and I have like 200 apps on this thing and I have like 280 megs left.
Oh, and my gTablet is clocked to 1.5GHz!
G_Dmaxx said:
Who gives a #$% about benchmarks? All I know is that this thing is fast, way faster than the EVO. I have a gTablet (Tegra 2, Honeycomb) that runs games very well, and this 3VO runs the same games but smoother and faster, no hiccups at all. Totally happy here, and I have like 200 apps on this thing and I have like 280 megs left.
Oh, and my gTablet is clocked to 1.5GHz!
Seriously, my Tegra 2 Transformer has nothing on my EVO 3D. Why people look only at benchmarks and not at what is in front of them, I have no clue.
danaff37 said:
Both the Droid X2 and the Galaxy S2 aren't running Sense, which usually drags down benchmarks even though the phone is silky smooth. Benchmarks may be useful for testing modifications on the same phone, but not for comparing different phones. Just ask yourself... Does it seem to suffer to you?
Sent from my PG86100 using XDA App
I've actually never had an AOSP ROM run all that much faster than a Sense ROM. Not enough of a variance to say that there's a difference at all.
Like many others have pointed out, Quadrant is a terrible bench for dual-core phones until it's updated. When it reads off a bunch of question marks as the EVO 3D's CPU, CPU speed, etc., you know it's not going to be a reliable test.
Sent from my PG86100 using Tapatalk
Go to AnandTech for the Adreno 220 benches... It crushed the competition, so maybe that'll make you feel better.
One possible reason why the EVO 3D isn't scoring as high as you expect is that I think the benchmark tests don't utilize CPUs with asynchronous dual cores correctly.
Someone correct me if I'm wrong, but I think the Galaxy uses synchronous cores, which means they can only work on the same thing at the same time; they can't work on separate operations at the same time.
The EVO 3D has asynchronous cores, which allow for true multitasking, meaning each core will work on separate tasks. As I understand it, support for this type of CPU is going to be added in Android 2.4 and later, but don't quote me on that.
LOL @ benchmarks
DDiaz007 said:
Go to AnandTech for the Adreno 220 benches... It crushed the competition, so maybe that'll make you feel better.
Any similar comparisons to the exynos/mali(?) that the sgs 2 is packing?
Some of the above statements about asynchronous processing do make me feel better if true.
Levito said:
Any similar comparisons to the exynos/mali(?) that the sgs 2 is packing?
Some of the above statements about asynchronous processing do make me feel better if true.
Why not feel good in the first place?
This phone screams. You're comparing it to a Moto phone with Tegra 2 which will likely be one of the last new phones with Tegra 2. Enjoy the 3D. By the time something comes around to crush it, we'll be into 4 core territory, or Android will be updated to better support multiple cores (if I remember right, this was only really started for 3.0).
I'll agree the SGS2 seems like a killer but I'll take HTC build quality over Samsung any day of the week. Plus, let's see Exynos pushing qHD.
No I hear you. Truth is that there probably won't be any software written for quite sometime that is going to really push our current hardware. Besides I upgrade every year or so anyway, making future proofing less of an issue for me.
It's the principle of the thing.
Levito said:
No I hear you. Truth is that there probably won't be any software written for quite sometime that is going to really push our current hardware. Besides I upgrade every year or so anyway, making future proofing less of an issue for me.
It's the principle of the thing.
I hear ya too, but you gotta try not to get caught up in numbers. Numbers can be manipulated. Manufacturers can tune their phones to perform better in Quadrant (this can also be done with custom ROMs; when it is, performance in other categories suffers). AMD and Intel still participate in this ePeen warfare.
I won't be surprised if we see that the Evo 3D outperforms the Tegra Moto overall.
The good thing is, we will eventually see this thing rooted completely (hopefully not after it's lost most of its luster). THEN we will see what we can push out of this phone. Look how fast it's running sense. Imagine a vanilla Android experience on it, or an overclock to say, 1.8 GHz (which will probably happen). I dunno about you but I'm salivating.
Ok, the only benchmark I need to know is that my phone boots up from "off" in 10-12 seconds. Base your satisfaction on a constant, not on relativism.
megatron-g1 said:
One possible reason why the EVO 3D isn't scoring as high as you expect is that I think the benchmark tests don't utilize CPUs with asynchronous dual cores correctly.
Someone correct me if I'm wrong, but I think the Galaxy uses synchronous cores, which means they can only work on the same thing at the same time; they can't work on separate operations at the same time.
The EVO 3D has asynchronous cores, which allow for true multitasking, meaning each core will work on separate tasks. As I understand it, support for this type of CPU is going to be added in Android 2.4 and later, but don't quote me on that.
Should be no difference coding for asynchronous or synchronous cores. The cores will run at full speed if they're pushed. Quadrant scores are based more on database read and write speeds than anything.
I've owned many many phones, and this one is by far the most fluid (although I have not had hands on with the Galaxy SII, but I hate Samsung's software)
I haven't run into a case where the phone stutters, have you?
I believe in the AnandTech benchmarks they used a developer phone with the same Qualcomm chipset running at the stock 1.5GHz, while our phones were downclocked to 1.2GHz.
They might have done this for various reasons; it would be interesting to see how our phones overclock and if there are any changes in battery life.
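The claim above that Quadrant-style scores lean heavily on database read/write speed is easy to illustrate; a toy sketch of an I/O-bound "benchmark" loop (SQLite in memory here, where a phone build would hit flash storage, so the numbers are illustrative only):

```python
import sqlite3
import time

def insert_rate(n=2000):
    """Time a burst of SQLite inserts and return rows/second."""
    con = sqlite3.connect(":memory:")  # a real device test would hit flash
    con.execute("CREATE TABLE t (k INTEGER, v TEXT)")
    start = time.perf_counter()
    with con:  # one transaction; per-row commits would be far slower
        con.executemany("INSERT INTO t VALUES (?, ?)",
                        ((i, "payload") for i in range(n)))
    elapsed = time.perf_counter() - start
    con.close()
    return n / elapsed

print(f"{insert_rate():.0f} inserts/sec")
```

A score built around a loop like this says more about the storage stack and filesystem than about the CPU or GPU, which is one reason two phones with similar SoCs can post very different totals.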

MSM8660/8260 vs Tegra 2 vs Exynos

I've been looking for a more technical analysis of these SoCs, and I have been trying to learn how the async CPU setup on the MSM8660 affects performance.
Nvidia claims that the power-saving feature of our CPU (async) will inevitably cause a decrease in performance:
http://www.intomobile.com/2011/03/2...ed&utm_campaign=Feed:+IntoMobile+(IntoMobile)
Does anyone have any comments on this? If this is the case, I am wondering if through software we can force both cores to run at the same voltage/frequency. I wonder if it would cause an increase in performance (at least in benchmarking). Many claim that the Evo 3D only gets mediocre benchmark scores due to having asynchronous cores that are not being accurately benched. It would be interesting to verify this claim.
Also, does anyone know which SoC among the three I listed in the title is the highest in performance (not talking about useless benchmarks like Quadrant)?
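On forcing both cores to the same frequency: with root, the standard Linux cpufreq sysfs interface is the usual lever. A hedged sketch (the paths below are the stock cpufreq layout; whether the MSM8660's second core honours them independently is exactly the open question in this thread):

```python
import glob
import os

CPUFREQ_ROOT = "/sys/devices/system/cpu"

def governor_paths(root=CPUFREQ_ROOT):
    """scaling_governor files, one per core exposed by cpufreq."""
    pattern = os.path.join(root, "cpu[0-9]*", "cpufreq", "scaling_governor")
    return sorted(glob.glob(pattern))

def pin_to_performance(dry_run=True):
    """Set every core's governor to 'performance' (needs root on a device)."""
    for path in governor_paths():
        if dry_run:
            print(f"echo performance > {path}")
        else:
            with open(path, "w") as f:
                f.write("performance")

pin_to_performance()  # dry run: just prints the shell commands it would issue
```

This is what SetCPU's "performance" mode amounts to under the hood; the dry-run default is there because writing these files on a live phone requires root and changes power behaviour.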
So... there is possibly a 10-15% decrease in performance... that's fine with me. Most of the time you won't even notice until you run benchmarks and look at the numbers.
SetCPU + Performance mode are all you should need
DarkManX4lf said:
So... there is possibly a 10-15% decrease in performance... that's fine with me. Most of the time you won't even notice until you run benchmarks and look at the numbers.
Well, the 10-15% slower figure is nVidia's claim; not sure if it's true.
Does that make both cores run at the same time, or is running both cores at the same time not possible due to the processor?
xHausx said:
SetCPU + Performance mode are all you should need
Sent from my PG86100 using XDA App
ttieder said:
Does that make both cores run at the same time, or is running both cores at the same time not possible due to the processor?
Sent from my PG86100 using XDA App
It will keep the CPU running at full speed. Which core gets used for what depends on a lot of things, but it mostly depends on how the apps and kernel are programmed.
xHausx said:
It will keep the CPU running at full speed. Which core gets used for what depends on a lot of things, but it mostly depends on how the apps and kernel are programmed.
Yes, but is it possible to keep both cores at their full frequency? Setting the Exynos or Tegra 2 to performance mode makes both cores stay at their maximum frequency, since they are synchronous. I think setting performance mode on the Evo 3D would only guarantee that one of the cores will remain at its full frequency.
Not sure about this, of course. Anyone have any insight into this?
The second core wouldn't kick in if you're not heavily multitasking or running multithreaded apps, and you wouldn't need the second core for minor multitasking or single-threaded operations, as a single core is enough.
I will tell you that on paper the MSM8x60 should beat out all, but in real-world use, the Exynos hammers everything. The S2 is a beast.
The Exynos is the better SoC, plain and simple. If we get into GPU discussions, the Adreno 220 is the best, as in better than Mali 400.. Go to Anandtech, and watch them use a Qualcomm device for the benches.
Sent from my PG86100 using XDA App
Is it a "for sure" thing that ICS will use the GPU acceleration in the OS? Or is that just everyone's hopes and dreams
Sent from my PG86100 using XDA Premium App
You could program the kernel to keep both cores at max frequency. I'm not a developer, but I am sure something like this could be done.
Sent from my HTC Sensation Z710e using Tapatalk
bballer71418 said:
Is it a "for sure" thing that ICS will use the GPU acceleration in the OS? Or is that just everyone's hopes and dreams
Sent from my PG86100 using XDA Premium App
ICS will include all of the features that Honeycomb has, and Honeycomb has 2D acceleration, so yes.
Sent from my PG86100 using XDA Premium App
Maybe we should make some real-world benchmarks and get some SGS2 people in on it. Like how fast a particular app opens (say Angry Birds), how many FPS a game plays at, converting a file to another format, completing a 5-step plan to take over the world, things like that. A lot of things like that are how reviewers rate and test things like new video cards and CPUs, plus all the benchmark programs.
I used to use a program called FRAPS to see how many FPS my PC games were playing at so I could tweak stuff (long live Unreal Tournament!!!!). It would just display the FPS in the top corner of the screen.
Also, comparing the 3VO and SGS2 will really heat up when we get root and CM7. ROMs that are 400MB lighter have gotta make a huge difference in performance. I dunno about you guys, but I haven't been able to clog up my 3VO yet (and I've been trying!), I'm pretty impressed with the hardware so far.
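The "real world benchmark" idea above - time a fixed task, count frames FRAPS-style - only needs a couple of helpers; a minimal sketch (function names are made up for illustration):

```python
import time

def time_task(task, *args):
    """Wall-clock seconds for one run of a repeatable task
    (app launch, file conversion, etc.)."""
    start = time.perf_counter()
    task(*args)
    return time.perf_counter() - start

def average_fps(frame_stamps):
    """Average FPS from a list of per-frame timestamps, FRAPS-style."""
    if len(frame_stamps) < 2:
        return 0.0
    return (len(frame_stamps) - 1) / (frame_stamps[-1] - frame_stamps[0])

# 61 evenly spaced frames over one second -> 60 fps
print(average_fps([i / 60 for i in range(61)]))  # 60.0
```

Comparing median times over several runs, rather than a single run, smooths out background-task noise on a phone.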
Drewmungus said:
Maybe we should make some real-world benchmarks and get some SGS2 people in on it. Like how fast a particular app opens (say Angry Birds), how many FPS a game plays at, converting a file to another format, completing a 5-step plan to take over the world, things like that. A lot of things like that are how reviewers rate and test things like new video cards and CPUs, plus all the benchmark programs.
I used to use a program called FRAPS to see how many FPS my PC games were playing at so I could tweak stuff (long live Unreal Tournament!!!!). It would just display the FPS in the top corner of the screen.
Also, comparing the 3VO and SGS2 will really heat up when we get root and CM7. ROMs that are 400MB lighter have gotta make a huge difference in performance. I dunno about you guys, but I haven't been able to clog up my 3VO yet (and I've been trying!), I'm pretty impressed with the hardware so far.
Fraps tends to lie with FPS.
Sent from my PG86100 using XDA App
GPU acceleration will be nice. Hope we see ICS soon.
Sent from my EVO 3D w/ Tapatalk
It is known that the MSM8660 can achieve higher clock frequencies than the Exynos, though clock for clock the Exynos has better IPC.
As of right now the GSII beats the 3VO in both benchmarks and real-world tests, but I suspect this is because Sense is a pig that takes far too much RAM and system resources. HTC also seems to have poorer, unoptimized drivers. In addition to this, the async CPUs of the 3VO may not be properly tested by current benchmarking tools.
I think comparing a rooted 3VO and a rooted GSII should be much closer. Imagine the MSM8660 at 1.8-2.0GHz, both cores running at full frequency, with no Sense and other bloat to slow it down. Combine that with a hardware-accelerated GUI and this phone should be amazing.
The Adreno GPU will get better over time... and will develop much faster than before. Since Qualcomm purchased the branch from AMD (ATI), there has been much improvement in a reasonably small amount of time. There are various claims that the Adreno 220 outperforms the Tegra 2. I haven't seen a solid comparison of the Adreno 220 vs the Exynos, although I have read that the Exynos is a very capable processor.
As they both stand in stock offering, the Samsung GS2 will be faster; it has tremendously fewer resources to move. I agree with what has been said about root & ROM options: CM7 on the EVO 3D will likely result in unprecedented (real-world) benchmarks. Also note that the current Android releases are not yet optimized for dual/quad-core management. But rest assured, it is well under development, and the Sprint EVO 4G4D (hypothetical name) will behold a treasure trove of menacing capabilities.
HTC + Qualcomm + Android = Future
I think we should just wait until we can do a head-to-head AOSP CM 7 benchmark/real world test to see what happens. I'm confident the SGSII will get shredded by the E3D.
It seems unfair to compare anything within the phone itself now, because of what each phone has to run. Sense is pretty tasking on our phones and I can't say as much for the opposition.
It's funny to see NVIDIA make snide comments about Qualcomm when their phones are getting bested. Although I must say it is impressive to see that Tegra 2 phones are over a year old and keeping up with the E3D's dual-core deliciousness.
Just my thoughts.
Personally I don't believe Nvidia, plenty of benchmarks contradict their statement. That and whoever said "Additionally, the operating systems like Android and many apps aren’t set up for an asynchronous architecture." is an idiot because 99% of apps in the market don't support dual core lmfao.

Nexus Prime/Galaxy to have same GPU as our phone?

According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a T.I. OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR 540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the T.I. OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with T.I. to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models in their new flagship phone. What are your thoughts?
Zacisblack said:
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a T.I. OMAP 4460 SoC(System on a chip) down clocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR 540 GPU which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the T.I. OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with T.I. to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last years models to be in their new flagship phone. What are your thoughts?
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
nope, it's samsung. you can take off your tinfoil hat since that was officially confirmed about a year ago.
op, where did you get that information? it's been stated that it will have an exynos processor, the latest and greatest from samsung. I don't have a source but the whole point of the nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
sageDieu said:
nope, it's samsung. you can take off your tinfoil hat since that was officially confirmed about a year ago.
op, where did you get that information? it's been stated that it will have an exynos processor, the latest and greatest from samsung. I don't have a source but the whole point of the nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
Not saying it's 100%, but 4 of 5 Android websites have concluded that the OMAP series is the platform of choice for Google's new OS. No tech blog/website has stated it will have Exynos, and the OMAP 4470 would be more powerful either way. As linked below, Android Police strongly asserts that the new device will have the OMAP 4460 downclocked to 1.2GHz. But like I said, I'm asking for everyone's thoughts, because I can definitely see Google surprising us.
http://www.androidpolice.com/2011/1...eam-sandwich-phone-sorry-prime-is-not-likely/
You can also check Engadget, AndroidCentral, Anandtech, Android Authority,and PhanDroid.
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
You could be partially right. Some rumors have suggested that the Prime and Galaxy Nexus are two different devices. What saddens me is that the Galaxy Nexus I-9250 passed through the FCC with GSM only.
The 4460 has roughly a 100MHz GPU clock boost compared to ours, and I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
TheSonicEmerald said:
The 4460 is has a 100mhz boost in terms of GPU compared to ours. And I can't think of any game/app that would need more than that.
184MHz, I think -- almost double. Except the Nexus is going to have 2.4 times the pixels of the Fascinate (2.22 if you don't count the soft-key area).
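For what it's worth, the pixel math here checks out. The panel resolutions below are my own assumptions from published specs (1280x720 for the Galaxy Nexus, ~96px of that for the soft-key bar, 800x480 for the Fascinate), not figures from this thread:

```python
# Rough pixel-count comparison: Galaxy Nexus vs. Fascinate.
nexus_px = 1280 * 720        # Galaxy Nexus full panel (assumed spec)
nexus_app_px = 1184 * 720    # minus an assumed ~96px soft-key strip
fascinate_px = 800 * 480     # Fascinate WVGA panel (assumed spec)

print(round(nexus_px / fascinate_px, 2))      # 2.4  - the "2.4 times the pixels" claim
print(round(nexus_app_px / fascinate_px, 2))  # 2.22 - excluding the soft-key area

# GPU clock bump quoted later in the thread (200MHz -> 384MHz):
print(384 / 200)  # 1.92 - "almost double"
```

So the clock nearly doubles, but the pixel count more than doubles too, which is the poster's point.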
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
oh tonu, still trying to have conversations about things you know nothing about.
Sent from my Incredible 2 using XDA App
TheSonicEmerald said:
The 4460 is has a 100mhz boost in terms of GPU compared to ours. And I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
Clock speed alone isn't going to improve graphics. The dual-core PowerVR SGX543MP2 GPU in the A5 chip would still run laps around an overclocked SGX540 in terms of speed, throughput, and things such as shadows, textures and triangles.
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Hah. Imagine having the PowerVR SGX 543MP4 from the PS vita in the prime. That would run laps around the MP2 XD
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
cherrybombaz said:
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
They probably chose it so the software could be tuned tightly to the hardware. That means the Galaxy Prime should run ICS extremely well, probably better than some dual-core phones, but it will lag in the gaming department. If you don't really game a lot it shouldn't matter much; it will still be really fast. They've also increased the GPU clock from 200MHz to 384MHz, which is almost twice as fast.
I thought about the 4S too, but then I realized: why have all that power if the system makes little use of it? The only thing it's really good for is gaming, and who wants to do that on a 3.5" screen? At this point the Nexus is probably the better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away, but it didn't. I like the way it looks, but the hardware is just lacking, and it's not worth my upgrade or $300.
Very well stated I'm also not all in on the GN. We'll see once I can actually play with one in store next month
Sent from my SCH-I500 using XDA Premium App
Zacisblack said:
They probably put it in to work around the hardware. This means that the Galaxy Prime will run extremely well with ICS probably better than some dual core GPU phones but it will lack in the gaming department. If you don't really game alot it shouldn't matter that much it will be really fast. They've also increase the clock speed from 200Mhz to 386 Mhz which is almost twice as fast.
I thought about the 4S thing too but then I realized, "why have all that power if the system takes little use of it?". The only thing it's really good for is gaming but who want's to do that on a 3.5" screen. At this point, the Nexus is probably a better real world choice but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away but it didn't. I like the way it looks but the hardware is just lacking and it's not worth my upgrade or $300.
True. But Infinity Blade 2 looks pretty amazing, and if more developers can take advantage of the 543MP2, that would be great. Then again, you can always wait a few more months and something better will come out, so I don't think it's a good idea to wait for the GS3 - and it'll take much more than a few months for it to reach US carriers. I agree that $300 is a bit of a hard pill to swallow, especially when you can get a GSII with better hardware for cheaper.

Whats next after quad-core?

So in 2011 we have Tegra 2, in 2012 we have Tegra 3, so my question is: what will come in 2013? Octo-core, or an improved version of quad-core CPUs?
Fasty12 said:
So in 2011 we have Tegra 2, in 2012 we have Tegra 3 so my questions is what will come in 2013? Octo-core or an improved version of quad core cpus?
Well, as octo-core desktop CPUs haven't really caught on yet, I would guess just better quad cores, likely with more powerful GPUs.
Tegra 3 is already very powerful; presumably they will increase RAM, make the cores more battery-efficient, or push clock speeds even higher. The 12-core Tegra GPU is pretty amazing already, and anything better must be godly.
Sent from my HTC Desire using xda app-developers app
If you mean the mobile platform: will we really need to go beyond quad core? Seeing how smoothly the SGSIII runs with one, how much more perfection and speed do you need to do your work (though more can always be expected)? Since Android uses the extra cores on an as-needed basis, why have 2-3 cores that are never used? I think curiosity, and wanting the most advanced/latest hardware, are the only reasons to have such a powerful CPU in a phone.
What I'd like to see is more RAM installed and less RAM used by the system...
Sounds like octo-mom... the debate lives on: battery vs performance. But to answer your question, I think the next step would be hexa-core, which is 6. Let's wait and see what is to come...
Sent from my SGH-T989 using Tapatalk 2
s-X-s said:
If u mean for mobile platform , Will we really need beyond Quad core, having seen how SGSIII is smoothly running with it, beyond that what more perfection ( yaa still more can be expected) and speed u will need to do ur work . As known Android use other cores on need basis , why u need to see ur 2-3 cores never used.. i think its just more curiosity n to have more advaced/latest will be the only reason to have such high cpu on ur mobile..
What I like to see is ups in RAM installed and lows in RAM usage by system...
I agree. Core counts are at their peak right now. The amount of CPU power we have, especially in the higher-end phones, is enough to accomplish many, many things. RAM is somewhat of an issue, especially since multitasking is a huge part of Android; I really think 2GB of RAM should become the standard soon. Also, better GPUs won't hurt.
Sent from my HTC T328w using Tapatalk 2
If they decide to keep going on the core upgrade in the next two or so years, I see one of two possibilities happening:
1) Dual Processor phones utilizing either dual or quad cores.
or
2) Hexa-core chips, since the desktop market already has a few 6-core chips (though whether they would actually be practical in a phone's architecture, no clue).
Generally speaking whatever they come out with next will either need a better battery material, or lower power processors.
I mean, I'm pretty amazed by what my brother's HTC One X is capable of with the quad core, and here I am still sporting a single-core G2. But yes, I would like to see more advancement in RAM usage; we've got a nice bit of power, but how about a standard 2GB of RAM for better multitasking?
I believe 2013 will be all about more efficient quad-cores.
May I ask what going from 1GB to 2GB will improve? Loading times?
Hello everyone, could you tell me what a quad core is?
Quad core means that a processor has four processing units (cores).
Because there are more cores, up to four threads can run at the same time, so well-parallelized work can, in theory, finish up to four times faster. A single-threaded process won't speed up at all, though.
Read more about it: http://simple.wikipedia.org/wiki/Multi-core_processor
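As a toy illustration of the idea (my own sketch, not from the post above): work that can be split into independent chunks can be spread across workers, and the combined result stays the same. Note that in CPython, a true CPU-bound speedup needs processes rather than threads because of the GIL; the thread pool here just shows the partitioning pattern.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def sum_squares(lo, hi):
    """Sum of k*k for k in [lo, hi) - an easily partitioned workload."""
    return sum(k * k for k in range(lo, hi))

n = 1_000_000
workers = os.cpu_count() or 4  # one chunk per available core

# Split [0, n) into `workers` contiguous chunks.
bounds = [(i * n // workers, (i + 1) * n // workers) for i in range(workers)]

with ThreadPoolExecutor(max_workers=workers) as pool:
    parallel_total = sum(pool.map(lambda b: sum_squares(*b), bounds))

# Same answer as doing it all on one core, just computed in chunks.
assert parallel_total == sum_squares(0, n)
```

Swap `ThreadPoolExecutor` for `ProcessPoolExecutor` and a CPU-bound job like this can actually use all four cores of a quad-core chip at once.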
Maybe i7 in mobile devices?
I'm sure it will stay at quad-core CPUs; anything more is just overkill. They may introduce hyperthreading. It's going to boil down to efficiency.
Sent from my SPH-D700 using xda premium
I'd say the future lies in more efficient use of processors. Right now, Android is still far from optimized on multi-core processor-equipped devices. Project Butter is the start of a great movement by Google to optimize the operating system. Hopefully it spreads out to other OEMs and becomes the main focus for Android development.
Improving and optimizing current processors is the way hardware companies should go.
In my opinion, processor development is outrunning battery development. Optimized processors could reduce power consumption while preserving excellent speed and usability.
Sent from my Transformer TF101 using Tapatalk 2
Building processors on more efficient ARM architectures is going to be the way forward from what I see... throwing four less efficient cores at a problem is the caveman method of dealing with it (looking at you, Samsung Exynos Quad, based on tweaked A9 cores).
The A15-class Qualcomm S4 Krait is more efficient on a clock-for-clock, core-for-core basis, and once the software catches up and starts using the hardware to its full capacity, fewer, more efficient cores will be preferred.
I don't see anything beyond quads, simply because they haven't even scratched the surface of what can be done with a modern dual-core processor yet... throwing more cores at it only makes excuses for poor code. I can shoot **** faster than water with a big enough pump, but that doesn't mean it's the better solution.
We don't need more cores! Having more than 2 cores will not make much difference for most apps, so quad cores are a waste of space on the CPU die.
Hyperthreading, duh.
More ram. Got to have the hardware before the software can be made to use it.
With the convergence of x86 into the Android core and the streamlining of low-power Atom CPUs, the logical step would be to first optimize the current software base for multi-core processors, before marketing takes over with its stupid x2 multiplier game...
Not long ago, a senior Intel exec went on record saying that today a single-core Android smartphone perhaps performs better overall (battery life, user experience, etc.) than any dual/quad-core one. Mind you, these guys seldom if ever stick their necks out with such bold statements, especially ones not pleasing to the ear...
For those interested, you can follow this one (of many) articles on the subject: http://www.zdnet.com/blog/hardware/intel-android-not-ready-for-multi-core-cpus/20746
Android needs to mature, and I think it actually is maturing. With 4.1 the focus has drastically shifted to optimization, UX and performance with *existing/limited* resources. This will translate to devices beating all else in battery life, performance and graphics, but since this was neglected in the first several iterations, we will likely see 4.0, followed by 4.1, then maybe 4.2 before we see a 5.0 that showcases the maturity and evolution of the experience.
Just my 2c. :fingers-crossed:

iPad 4 vs 5250 (Nexus 10 Soc) GLBenchmark full results. UPDATE now with Anandtech!!

XXXUPDATEXXX
Anandtech has now published its performance preview of the Nexus 10; let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result for the iPad 4 has appeared on GLBenchmark, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so should be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, an important criterion for gaming.
Which device wins, click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is nearly twice as fast as its older brother, and the Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS; however, Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500MHz vs 250MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output etc. are not double the iPad 3's, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
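Taking the Egypt HD offscreen numbers quoted above at face value, the generation-over-generation ratios work out as follows (my arithmetic on the thread's figures, nothing more):

```python
# Egypt HD offscreen (1080p) FPS figures quoted in the post above.
ipad4, ipad3, exynos_5250 = 48.6, 25.9, 33.7

print(round(ipad4 / ipad3, 2))        # 1.88 - "twice as fast" is a slight round-up
print(round(ipad4 / exynos_5250, 2))  # 1.44 - iPad 4's lead over the Arndale/5250 as tested
print(round(ipad4 / 40, 2))           # 1.22 - the lead if final N10 drivers reach ~40 FPS
```

So even the projected driver improvements would narrow, not close, the gap on this particular test.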
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports on the Nexus 10 are overwhelmingly positive.
So the Mali-T604 didn't come out 5 times better than the Mali-400 - or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
hung2900 said:
So Mali 604T didn't match 5 times better than Mali 400, or maybe Samsung underclocked it.
Still very good but not the best.
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as we can expect for the N10. Samsung has a habit of improving its drivers through later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic - even worse than Tegra 2.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x, and Pro also got 20% higher - and they could have been higher still if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in the numbers.
If you check the SGS2's numbers now, there's another 50% improvement in performance since the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how big an effect driver optimization can have on performance. My point is that we have to wait for proper testing on the final release N10 device.
Also, look carefully at the fill rate in the Arndale board test. It's much less than expected. ARM says a Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s; it's actually showing only about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU at 533MHz, so it shouldn't be falling so far short.
According to Samsung, it more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
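To make the fill-rate argument concrete: ARM's 2 GPixels/s at 500MHz implies 4 pixels per clock, so at Samsung's 533MHz the theoretical peak and the ~60% observed figure work out roughly as follows (my arithmetic on the numbers above):

```python
# Implied pixels-per-clock from ARM's quoted figure: 2 GPix/s at 500MHz.
pixels_per_clock = 2e9 / 500e6       # 4.0

# Theoretical peak at Samsung's 533MHz clock.
peak_gpix = pixels_per_clock * 533e6 / 1e9
print(round(peak_gpix, 2))           # 2.13 - close to Samsung's quoted 2.1 GPixels/s

# The Arndale board is showing only ~60% of that.
print(round(0.60 * peak_gpix, 2))    # 1.28 - roughly what the board delivers
```

A low-level synthetic test landing ~40% below the theoretical peak is indeed suspicious, which supports the driver-maturity explanation.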
hung2900 said:
So Mali 604T didn't match 5 times better than Mali 400, or maybe Samsung underclocked it.
Still very good but not the best.
________________
Edit: I forgot that Exynos 4210 with Mali400MP4 GPU had very bad GLbenchmark initially (even worse than PowerVR SGX540), but after updating firmware it's way better than other SoCs on Android handsets.
In areas where the Mali-400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance, whereas the iPad 4 is not a concrete 2x the iPad 3 in these low-level tests, yet achieves twice the FPS in Egypt HD that its older brother does. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with pinch of salt. It's a dev board, and I doubt it has optimized drivers for the SoC like it's expected for N10. Samsung has this habit of optimizing the drivers with further updates.
SGS2 makes for a good case study. When it was launched in MWC2011, it's numbers were really pathetic. It was even worse than Tegra2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you again check the numbers now for SGS2, it's again another 50% improvement in performance from the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show that how driver optimization can have a big affect on the performance. My point is that we have to wait for proper testing on final release of N10 device.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz. So, it shouldn't be getting so less.
According to Samsung, it more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
I agree with most of what you have said. On the GPixel figure: this is like ATI GPUs' theoretical teraflops always being much higher than Nvidia's - with code written to hit the device perfectly you might see those high figures, but in reality Nvidia cards with lower on-paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The resources needed to drive those high-MP displays mean lots of compromises in effects, polygon complexity etc. to ensure decent FPS - especially when you consider that driving Battlefield 3 at 2560x1600 with AA and high textures requires a PC burning 400+ watts of power, not a 10-watt SoC.
Overall, considering the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both. The biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
What's the point of speculation? Just wait for the device to be released and run all the test you want to get confirmation on performance. Doesn't hurt to wait
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures to? A15 is quite a big step, SW updates may be necessary for proper bench.
Both A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if new SoCs become powerful enough to max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0ghz quad-core variant in the semi near future... Ohhhh the possibilities. Samsung has one hell of a piece of silicon on their hand.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google if you want to use Chrome as the stock browser, then develop to fast and smooth and not an insult, stock AOSP browser would be so much faster.
True... Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would get much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimizations aren't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware; the N4 in fact scores lower than the Adreno 225 in some cases. That is totally whacked.
For the N10, I am still wondering about the fill rate. Need to check what the guys say about this.
Is it running some debugging code on the devices at this time?
Turbotab said:
Both A9 & A15 use the same instruction set architecture (ISA) so no they won't. Benchmarks may need to be modified, if the new SoC are too powerful and max out the old benches, but for GL Benchmark, that has not happened yet and there are already new updates in the pipeline.
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
I have to disagree, this is from ARM's info site.
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
Keion said:
Once we get rid of the underclock no tablet will be able to match. I'm sure the Mali t604 at 750 MHz would destroy everything.
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that a resolution like that taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that Awesome resolution does tax the GPU a lot. Heck most lower end desktop GPUs would struggle
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Nah, I think we can beat that too.
Drivers + OC.
