running full speed interesting observation - Asus Eee Pad Transformer Prime

OK, I've got mine on normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is correct, these programs are still running in the background, right? Then it starts kicking in the other 4 and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we can get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?

markimar said:
OK, I've got mine on normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is correct, these programs are still running in the background, right? Then it starts kicking in the other 4 and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we can get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?
I'll check on this when I get home. This issue, I'm assuming, is with Honeycomb itself. We would assume that ICS would properly use those cores.
Sent from my Samsung Galaxy S II t989

I don't have it yet (mine gets delivered on Wednesday), but what you observed makes perfect sense. Can they change it to run at, say, a constant 800 MHz, dropping "down" to 500 MHz only for the most simple tasks? Obviously I too do not believe that 500 MHz will be sufficient at all times to do screen scrolling and such on its own.
I'm really hoping that the few performance issues people are seeing are resolved in firmware updates and a Tegra 3 optimized version of ICS. Maybe Asus/Nvidia needs to do more tweaking to HC before the ICS build is pushed, if it will take a while for ICS to arrive on the Prime (past January).

The cores are optimized just fine. They kick in when rendering a web page or a game, but go idle and use the 5th core when done. Games always render.

ryan562 said:
I'll check on this when I get home. This issue, I'm assuming, is with Honeycomb itself. We would assume that ICS would properly use those cores.
Sent from my Samsung Galaxy S II t989
Nothing's changed over HC in the way ICS uses h/w acceleration. And I'd assume apps using h/w acceleration do so via calls to the OS, not to the chip directly. So it appears what you've got is what you're going to get.
---------- Post added at 06:59 PM ---------- Previous post was at 06:55 PM ----------
markimar said:
OK, I've got mine on normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is correct, these programs are still running in the background, right? Then it starts kicking in the other 4 and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we can get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?
Do you have Pulse installed? A bunch of people using it were reporting stuttering that their lower-powered devices aren't showing. If you run it at full speed, does it stutter? One of the hypotheses is that it's the cores stepping up and down that's causing the stuttering.

BarryH_GEG said:
Nothing's changed over HC in the way ICS uses h/w acceleration. And I'd assume apps using h/w acceleration do so via calls to the OS, not to the chip directly. So it appears what you've got is what you're going to get.
Also, correct me if I'm wrong, but I don't think that the OS knows about the fifth core? I believe the chip's own scheduler manages the transition between the quad-core and the companion core, not the Android scheduler.

Mithent said:
Also, correct me if I'm wrong, but I don't think that the OS knows about the fifth core? I believe the chip's own scheduler manages the transition between the quad-core and the companion core, not the Android scheduler.
That's the way I'd guess it would work. I don't think Android addresses different chips differently. I'd assume it's up to the SoC to manage the incoming instructions and react accordingly. If Android was modified for dual-core, I don't think it differentiates between the different implementations of dual-core chips. Someone with more h/w experience correct me if I'm wrong. Also, does anyone know if the chip manufacturer can add additional APIs that developers can write to directly, either instead of or in parallel with the OS? I ask because how can a game be optimized for Tegra if to the OS all chips are treated the same?

I tried out the power savings mode for a while. It seemed to perform just fine. The immediate difference is that it lowers the contrast ratio on the display. This happens as soon as you press the power savings tab. The screen will look like the brightness dropped a bit, but if you look closely, you'll see it lowered the contrast ratio. The screen still looks good, but not as sharp as in the other 2 modes. The UI still seems to perform just fine. Plus I think the modes don't affect gaming or video playback performance. I read that somewhere, either AnandTech or Engadget. When watching vids or playing games, it goes into normal mode. So those things won't be affected no matter what power mode you're in, I think..lol
I was thinking of starting a performance mode thread, to see different people's results and thoughts on the different power modes. I read some people post that they just use it in power/battery savings mode. Some keep it in normal all the time. Others in balanced mode. It would be good to see how these different modes perform in real-life usage, from a user perspective. I've noticed, so far, that in balanced mode, the battery drains about 10% an hour. This is with nonstop use including gaming, watching vids, web surfing, etc. Now in battery savings mode, it drains even less per hour. I haven't run normal mode long enough to see how it drains compared to the others. One thing though: web surfing drains the battery just as fast as gaming.

BarryH_GEG said:
I ask because how can a game be optimized for Tegra if to the OS all chips are treated the same?
I hate quoting myself but I found the answer on Nvidia's website. Any optimizations are handled through OpenGL. So games written to handle additional calls that Teg2 can support are making those calls through OpenGL with the OS (I'm guessing) used as a pass-through. It would also explain why Tegra optimized games fail on non-Teg devices, because they wouldn't be able to process the additional requests. So it would appear that Teg optimization isn't being done through the OS. Again, correct me if I'm wrong.
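For anyone who wants to poke at this themselves: the chip-specific capabilities show up as OpenGL ES extension strings rather than as anything the OS exposes separately. A rough sketch of what that check could look like (my own illustration, not anything from Nvidia's docs; it has to run on a thread that currently owns a GL context, e.g. inside GLSurfaceView.Renderer.onSurfaceCreated):
Code:
import android.opengl.GLES20;
import android.util.Log;

public class TegraCheck {
    // Dumps the GPU name and any Nvidia-specific GL extensions to logcat.
    public static void logGpuExtensions() {
        String renderer = GLES20.glGetString(GLES20.GL_RENDERER);     // GPU name string
        String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS); // space-separated list
        Log.d("TegraCheck", "Renderer: " + renderer);
        for (String ext : extensions.split(" ")) {
            // Nvidia-specific extensions are conventionally prefixed with GL_NV_
            if (ext.startsWith("GL_NV_")) {
                Log.d("TegraCheck", "Nvidia extension: " + ext);
            }
        }
    }
}
A Tegra-optimized game would branch on strings like these and use the extra GL calls when they exist, which fits the "pass-through" idea: the OS doesn't need to know anything about the chip.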

BarryH_GEG said:
That's the way I'd guess it would work. I don't think Android addresses different chips differently. I'd assume it's up to the SoC to manage the incoming instructions and react accordingly. If Android was modified for dual-core, I don't think it differentiates between the different implementations of dual-core chips.
I did some research on it; here's what Nvidia say:
The Android 3.x (Honeycomb) operating system has built-in support for multi-processing and is capable of leveraging the performance of multiple CPU cores. However, the operating system assumes that all available CPU cores are of equal performance capability and schedules tasks to available cores based on this assumption. Therefore, in order to make the management of the Companion core and main cores totally transparent to the operating system, Kal-El implements both hardware-based and low level software-based management of the Companion core and the main quad CPU cores.
Patented hardware and software CPU management logic continuously monitors CPU workload to automatically and dynamically enable and disable the Companion core and the main CPU cores. The decision to turn on and off the Companion and main cores is purely based on current CPU workload levels and the resulting CPU operating frequency recommendations made by the CPU frequency control subsystem embedded in the operating system kernel. The technology does not require any application or OS modifications.
http://www.nvidia.com/content/PDF/t...e-for-Low-Power-and-High-Performance-v1.1.pdf
So it uses the existing architecture for CPU power states, but intercepts that at a low level and uses it to control the companion core/quad-core switch?
Edit: I wonder if that means that tinkering with the scheduler/frequency control would allow the point at which the companion core/quad-core switch happens to be altered? If the OP is correct, this might allow the companion core to be utilised less if an increase in "smoothness" was desired, at the cost of some battery life?
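Since, per the whitepaper, the switch is keyed off the kernel's CPU frequency recommendations, the obvious place to start poking is the standard cpufreq sysfs interface. A rough sketch of that kind of inspection; the exact files and tunables exposed depend on the kernel Asus ships, so treat the paths as assumptions:
Code:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CpufreqPeek {
    // Standard Linux cpufreq sysfs location; which files exist (and which
    // governor tunables are exposed) depends entirely on the device's kernel.
    private static final String BASE = "/sys/devices/system/cpu/cpu0/cpufreq/";

    static String read(String name) {
        try (BufferedReader r = new BufferedReader(new FileReader(BASE + name))) {
            return r.readLine();
        } catch (IOException e) {
            return "(not readable: " + e.getMessage() + ")";
        }
    }

    public static void main(String[] args) {
        System.out.println("governor:  " + read("scaling_governor"));
        System.out.println("cur freq:  " + read("scaling_cur_freq"));
        System.out.println("min/max:   " + read("scaling_min_freq") + " / " + read("scaling_max_freq"));
        System.out.println("available: " + read("scaling_available_frequencies"));
    }
}
Writing scaling_min_freq or governor tunables (which is presumably how you'd shift the switch point) needs root, and the companion-core hand-off itself sits below this level in Nvidia's logic, so there's no guarantee any of it is actually adjustable from here.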

Mithent said:
I wonder if that means that tinkering with the scheduler/frequency control would allow the point at which the companion core/quad-core switch happens to be altered? If the OP is correct, this might allow the companion core to be utilised less if an increase in "smoothness" was desired, at the cost of some battery life?
So what we guessed was right. The OS treats all multi-cores the same and it's up to the chip maker to optimize requests and return them. To your point, what happens between the three processors (1+1x2+1x2) is a black box controlled by Nvidia. To any SetCPU-type program it's just going to show up as a single chip. People have tried in vain to figure out how to make the Qualcomm dual-cores act independently, so I'd guess Teg3 will end up the same way. And Nvidia won't even publish their drivers, so I highly doubt they'll provide any outside hooks to control something as sensitive as the performance of each individual core in what they're marketing as a single chip.

BarryH_GEG said:
Do you have Pulse installed? A bunch of people using it were reporting stuttering that their lower-powered devices aren't showing. If you run it at full speed, does it stutter? One of the hypotheses is that it's the cores stepping up and down that's causing the stuttering.
I have been running mine in balanced mode and have had Pulse installed since day one; no lag or stuttering in anything. Games and other apps work fine.

Well, my phone is slow when clocked at 500, so I wouldn't be surprised.
Sent from my VS910 4G using xda premium

Related

[Q] Are both cores used all the time?

Just as the question states. I know the second core will sleep when not needed, but say you launch an app, does the second core help load the app? The reason I ask is because I'm curious about the raw speed difference between the Atrix and Inspire. Now comparing the Inspire running at 1.8 and the Atrix seemingly stuck at 1 per core (I'm not saying the Atrix won't ever be OCed, but I'm just talking about what's currently available), I'm just curious if the second core will help the first with tasks. If it doesn't, would that make the Inspire technically way faster (obviously battery life may be an issue, but this isn't a battery comparison)?
Thanks for any insight
I think you should start by knowing that overclocking ARM processors gives little yield.
A Xoom at 1.5 GHz scores only 500 better than a non-overclocked Xoom on Quadrant.
I'm going to try and simplify the answer for you.
Will BOTH cores be used? Maybe. First off, is the app itself optimized for dual core, or does it even need dual core / multithreaded capability.
Secondly, and I think more importantly, what is the rest of the phone doing. So, let's say you fire up your favorite app, the phone is still doing stuff in the background. Maybe it's checking email. Maybe Google Latitude is checking your location and updating. The point is - the other core will still be around to offload this work.
Now, WILL it go to the other core? Maybe. Maybe not. I do work on some big Sun machines and have seen them use one or two out of 64 cores; even with massive loads and each core being used 100%, it refused to balance the load amongst CPUs.
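To make the distinction concrete, here's a toy sketch (nothing device-specific, just an illustration): a single-threaded app keeps all of its own work on one core no matter how many are present, while work the developer explicitly splits into threads is at least eligible to be scheduled onto the second core.
Code:
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class TwoCoreDemo {
    // CPU-bound busywork standing in for real app work.
    static long crunch(long iterations) {
        long acc = 0;
        for (long i = 0; i < iterations; i++) acc += i * 31;
        return acc;
    }

    public static void main(String[] args) throws Exception {
        final long work = 50_000_000L;

        // Single-threaded: both halves run on one core, back to back.
        long t0 = System.nanoTime();
        crunch(work);
        crunch(work);
        System.out.println("sequential ms: " + (System.nanoTime() - t0) / 1_000_000);

        // Two threads: the halves *may* land on different cores,
        // but only if the OS scheduler decides to spread them out.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Callable<Long> half = () -> crunch(work);
        t0 = System.nanoTime();
        Future<Long> a = pool.submit(half);
        Future<Long> b = pool.submit(half);
        a.get();
        b.get();
        System.out.println("two threads ms: " + (System.nanoTime() - t0) / 1_000_000);
        pool.shutdown();
    }
}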
Hope this helps.
mister_al said:
I'm going to try and simplify the answer for you.
Will BOTH cores be used? Maybe. First off, is the app itself optimized for dual core, or does it even need dual core / multithreaded capability.
Secondly, and I think more importantly, what is the rest of the phone doing. So, let's say you fire up your favorite app, the phone is still doing stuff in the background. Maybe it's checking email. Maybe Google Latitude is checking your location and updating. The point is - the other core will still be around to offload this work.
Now, WILL it go to the other core? Maybe. Maybe not. I do work on some big Sun machines and have seen them use one or two out of 64 cores; even with massive loads and each core being used 100%, it refused to balance the load amongst CPUs.
Hope this helps.
Yea, that's exactly what I figured; I was kinda going off the Windows/Intel multi-core setup. Even after dual+ cores have been out for quite some time, 95% of programs made still don't use more than one core (most of that remaining 5% being very CPU-intense programs: Photoshop, AutoCAD, etc.). But I get what you mean: the one core will be dedicated to what you're doing and not sharing cycles with anything else, because core 2 is working on whatever pops up. So basically the Atrix might be a little slower at doing things BUT it will always stay the same speed with less/no bog.
Techcruncher said:
I think you should start by knowing that overclocking ARM processors gives little yield.
A Xoom at 1.5 GHz scores only 500 better than a non-overclocked Xoom on Quadrant.
So you're saying Quadrant sucks as it does with most phones, or that OCing the Xoom (and Atrix) won't really do much?
I already built an APK for testing CPU usage on both processors... When I get some free time, I'm going to turn it into a widget... Here's what I noticed:
Because of the current OS and less dual core support in apps, the phone kind of kicks certain tasks into using the 2nd processor. The APK I built reads the '/proc/stat' file, and I've noticed that when the 2nd processor is being used it actually shows up in the file as 'cpu1'. However, when it's not being used the 'cpu1' line does not exist, and you can default the 2nd processor usage to 0%. It seems like performing core OS tasks (like installing apps) kicks the 2nd processor into use, which is what you can expect since Froyo supports dual cores.
Like everyone says, I'd expect to see more dual core usage on 2.3/2.4 (whichever Motorola gives) and when more apps are designed to kick certain threads onto the 2nd processor.
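For anyone curious what that /proc/stat check looks like, here's a rough sketch along the same lines (my own illustration, not the actual APK code): it reads the per-core 'cpuN' lines and treats a missing line as that core being offline, i.e. 0%.
Code:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class ProcStatReader {
    // Maps core name ("cpu0", "cpu1", ...) to total jiffies counted so far.
    // A core that is hot-plugged off has no line, so it won't appear in the map.
    public static Map<String, Long> readCoreJiffies() throws IOException {
        Map<String, Long> cores = new HashMap<>();
        try (BufferedReader r = new BufferedReader(new FileReader("/proc/stat"))) {
            String line;
            while ((line = r.readLine()) != null) {
                // Per-core lines look like: "cpu1 2255 34 2290 22625563 6290 127 456"
                // The aggregate line starts with "cpu " and is skipped here.
                if (line.startsWith("cpu") && !line.startsWith("cpu ")) {
                    String[] f = line.trim().split("\\s+");
                    long total = 0;
                    for (int i = 1; i < f.length; i++) total += Long.parseLong(f[i]);
                    cores.put(f[0], total);
                }
            }
        }
        return cores;
    }

    public static void main(String[] args) throws IOException {
        Map<String, Long> cores = readCoreJiffies();
        System.out.println(cores.containsKey("cpu1")
                ? "cpu1 is online, jiffies=" + cores.get("cpu1")
                : "cpu1 line missing: core offline, usage defaults to 0%");
    }
}
Real usage percentages would need two samples over time and a comparison of the idle vs. non-idle columns, but the "line missing = core off" behaviour described above falls out of this directly.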

asynchronous dual core vs others

I have a question about the 3D's dual core that I'd like more clarification on; the answers I'm getting by searching this site and Google are vague. So I've read that the cores are asynchronous, basically meaning the second core doesn't do much work unless needed, whereas others like the Tegra 2 and Exynos have both cores running, or something similar to that, and that this is affecting the benchmark scores. I also read that one would basically double the score of the 3D to get a more accurate reading. Can anyone confirm or further explain this?
Yes, asynchronous is when something operates on another thread while the main thread is still available for operating. This allows for better performance in terms of managing tasks. Now just because it doesn't score high on a benchmark, it doesn't mean it isn't going to perform. Also, this allows for better performance for the battery.
I haven't slept for the past 12 hours so if this doesn't help you, just let me know and I will fully elaborate on how the processor will operate on the phone. Now time for bed :'(
In short, asynchronous operation means that a process operates independently of other processes.
Think of transferring a file. A separate thread will be utilized for doing so. You will then be able to do other things, such as playing with the UI (Sense), since you will be using the main thread. If anything were to happen to the transferring file (such as it failing), you will be able to cancel it because it is independent, on another thread.
I hope this makes sense man, kind of tired. Now I'm really going to bed.
Sent from my PC36100 using XDA App
To be more specific, by asynchronous they mean that each core can run at a different clock speed. Core 1 could be at 1.2 GHz while core 2 is at 200 MHz. Most multi-core processors are synchronous, meaning all the cores are running at the same speed.
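If you want to see this for yourself, the kernel exposes the per-core state under sysfs. A quick sketch (the paths are the standard Linux cpufreq/hotplug ones, so whether they all exist and are readable depends on the ROM):
Code:
import java.io.BufferedReader;
import java.io.FileReader;

public class CoreClockPeek {
    static String readFirstLine(String path) {
        try (BufferedReader r = new BufferedReader(new FileReader(path))) {
            return r.readLine();
        } catch (Exception e) {
            return null; // file missing or not readable
        }
    }

    public static void main(String[] args) {
        for (int cpu = 0; cpu < 2; cpu++) {
            String base = "/sys/devices/system/cpu/cpu" + cpu;
            // "1" = online, "0" = hot-plugged off; cpu0 often has no 'online' file
            // because it can't be unplugged, so this may print null for it.
            String online = readFirstLine(base + "/online");
            String freq = readFirstLine(base + "/cpufreq/scaling_cur_freq"); // in kHz, per core
            System.out.println("cpu" + cpu + ": online=" + online + " freq=" + freq + " kHz");
        }
    }
}
Polling this in a loop while the phone works is enough to see the two frequencies moving independently of each other.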
donatom3 said:
To be more specific, by asynchronous they mean that each core can run at a different clock speed. Core 1 could be at 1.2 GHz while core 2 is at 200 MHz. Most multi-core processors are synchronous, meaning all the cores are running at the same speed.
^This too
Sent from my PC36100 using XDA App
I was also very curious to learn a little more about the async cores and how they differ from a standard "always-on" dual core architecture.
The first page/video I found talks about the Snapdragon core specifically.
http://socialtimes.com/dual-core-snapdragon-processor-qualcomm-soundbytes_b49063
From what I've gathered, it comes down to using the second core, and thus more power, only when needed, minimizing voltage and heat to preserve battery life.
The following video goes into similar and slightly deeper detail about the processor specifically found in the EVO 3D. The demo is running a processor benchmark with a visual, real-time display of usage on the two cores. You can briefly see how the two cores are trading off the workload between each other. It was previously mentioned somewhere else on this forum, but I believe that by separating a workload between two cores, the chip will use less power than putting the same workload on a single core. I'm sure someone else will chime in with some additional detail. Also, after seeing some of these demos, I'm inclined to think that the processor found in the EVO 3D is actually stable at 1.5 but has been underclocked to 1.2 to conserve battery. Only time spent with it in our hands will tell.
Another demo of the MSM8660 and Adreno 220 GPU found in the EVO 3D. It's crazy to think we've come this far in mobile phone technology.
What occurred to me is how complex community ROMs for such a device may become with the addition of video drivers that may continue to be upgraded and improved (think early video card tweaks for PC). Wondering how easy/difficult it will be to get our hands on them, possibly through extraction from updated stock ROMs.
EDIT: As far as benchmarks are concerned, I blame the inability of today's benchmarking apps to consider async cores or properly utilize them during testing when factoring the overall score. Because the current tests are most likely spread across cores, which favors efficiency, the scores are going to be much lower than what the true power and performance of the chips can produce. I think of it as putting a horsepower governor on a Ferrari.
thanks for the explanation everyone
The best demonstration is in the first video posted; notice when Charbax looks at the monitor. There on the top right are the frequencies of the two cores, and you'll notice both of them jumping around a lot, independent of each other. Using the cores "on-demand" only when needed ends up saving a lot of battery power, but doesn't give you any performance loss.
Harfainx said:
The best demonstration is in the first video posted; notice when Charbax looks at the monitor. There on the top right are the frequencies of the two cores, and you'll notice both of them jumping around a lot, independent of each other. Using the cores "on-demand" only when needed ends up saving a lot of battery power, but doesn't give you any performance loss.
Actually, I was thinking that beyond just the battery savings, there could be a performance gain. Think of it this way: if the manufacturer knows they only have to clock one core up to speed when needed, they can be more aggressive about their timings and have the core clock up faster than a normal dual core would, since they know they don't have to clock up both processors when only one needs the full speed.
I wonder if the drop to 1.2 GHz also serves to keep heat under control. It might not just be battery savings, maybe the small case of a phone doesn't allow for proper cooling to hit 1.5 safely.
I'd love to see some confirmation that the asynchronous nature of this chipset is what's responsible for the seemingly lackluster benchmarking.
mevensen said:
I wonder if the drop to 1.2 GHz also serves to keep heat under control. It might not just be battery savings, maybe the small case of a phone doesn't allow for proper cooling to hit 1.5 safely.
I'd love to see some confirmation that the asynchronous nature of this chipset is what's responsible for the seemingly lackluster benchmarking.
The "horrible" benchmark scores are simply due to the tests inability to consider async core performance. Wait till the tests are able to take this into consideration.
Sent from my HERO200 using XDA Premium App
RVDigital said:
The "horrible" benchmark scores are simply due to the tests inability to consider async core performance. Wait till the tests are able to take this into consideration.
Sent from my HERO200 using XDA Premium App
Click to expand...
Click to collapse
I went through all of your links, I didn't see anything that confirms that the benches are somehow affected by the asynchronous nature of the chipset. It's not that I don't believe you, I actually had that same theory when the benches first came out. I just don't have any proof or explanation of it. Do you have a link that provides more solid evidence that this is the case?
NVIDIA actually tells a different story (of course)
http://www.intomobile.com/2011/03/24/nvidia-tegra-2-outperforms-qualcomm-dualcore-1015/
AnandTech's article does explain some of the differences
http://www.anandtech.com/show/4144/...gra-2-review-the-first-dual-core-smartphone/4
It appears that Snapdragon (Scorpion) will excel in some tasks (FPU, non-bandwidth-constrained applications), but will fall short in others.
I'm pretty sure none of the benchmark apps have even been updated since the release of the Sensation, so yeah... How could they update the app to use the asynchronous processors if the only phones to use them have only recently been released?
Sent from my zombified gingerbread hero using XDA Premium App
I had the G2x for like 3 days and never got to root. Poor service where I live. But could the cores be set to a specific frequency independently when rooted like computers?
tyarbro13 said:
I had the G2x for like 3 days and never got to root. Poor service where I live. But could the cores be set to a specific frequency independently when rooted like computers?
Yea, if someone were to develop an app for that. I do not see why not.
Sent from my PC36100 using XDA App
Hmm...
If a program such as Smartbench (which takes advantage of dual cores) is stressing both cores to 1.2 GHz, then regardless of whether both cores are active or not, the bench will be accurate.
I would rather NOT have asynchronous cores, as there would be lag during frequency changes...
Ex:
2 cores running at 500 MHz vs 1 core @ 1 GHz and the other not active.
The 2 cores will produce less heat and use less energy...
Maedhros said:
Hmm...
If a program such as Smartbench (which takes advantage of dual cores) is stressing both cores to 1.2 GHz, then regardless of whether both cores are active or not, the bench will be accurate.
I would rather NOT have asynchronous cores, as there would be lag during frequency changes...
Ex:
2 cores running at 500 MHz vs 1 core @ 1 GHz and the other not active.
The 2 cores will produce less heat and use less energy...
They're dual; it would be better for them to run asynchronously. Not only that, but it is a phone, so there will be no lag between frequency changes. 2 cores running at 500 MHz will perform better than 1 core at 1 GHz.
Sent from my PC36100 using XDA App
tyarbro13 said:
I had the G2x for like 3 days and never got to root. Poor service where I live. But could the cores be set to a specific frequency independently when rooted like computers?
This is something that the hardware needs to be capable of. Software can only do so much. As far as I've seen Tegra isn't capable of it.
I read the AnandTech article and I came to the conclusion that in everyday tasks you might not see the difference between the two, even though Tegra 2 might bench higher. The main thing people don't talk about is the GPU. The Adreno 220 is a powerhouse GPU; it will probably stand strong when Tegra 3 comes out.
DDiaz007 said:
They're dual; it would be better for them to run asynchronously. Not only that, but it is a phone, so there will be no lag between frequency changes. 2 cores running at 500 MHz will perform better than 1 core at 1 GHz.
Sent from my PC36100 using XDA App
Huh... what are you saying? Sorry, I don't understand... On one hand you say asynchronous is better, and on the other you're saying 2 cores @ 500 will work better?
nkd said:
I read the AnandTech article and I came to the conclusion that in everyday tasks you might not see the difference between the two, even though Tegra 2 might bench higher. The main thing people don't talk about is the GPU. The Adreno 220 is a powerhouse GPU; it will probably stand strong when Tegra 3 comes out.
What?!?
The Adreno 220 is a horrible GPU. AT BEST it is equal to the GPU in the original SGS.
The reason benches are so different is because Qualcomm has made NO improvements in the CPU. The Desire HD CPU is the same as the Sensation's. While... the SGS2 + Tegra have IMPROVED CPUs.
ARM 7 vs ARM 9?
Maedhros said:
Huh... what are you saying? Sorry, I don't understand... On one hand you say asynchronous is better, and on the other you're saying 2 cores @ 500 will work better?
What?!?
The Adreno 220 is a horrible GPU. AT BEST it is equal to the GPU in the original SGS.
The reason benches are so different is because Qualcomm has made NO improvements in the CPU. The Desire HD CPU is the same as the Sensation's. While... the SGS2 + Tegra have IMPROVED CPUs.
ARM 7 vs ARM 9?
Dude go back to sleep. You have no clue what you are talking about.
Sent from my PC36100 using XDA Premium App

Proper Dual Core Usage?

Just an epiphany..
The Evo 3D has an asynchronous dual core, meaning as of right now, it unloads tasks to the other core when the 1st is being overburdened with things to do (am I essentially right on that one?)
So, basically, the second core only pops in to say hello once in a while, right?
With the way that CPUs work, the harder you push them, the more energy they need and the more heat they throw off, and that begins to multiply. So, overclocking the CPU even 0.2 GHz increases heat and power usage by 1.3x rather than just 1.0x (not actual values, just stating it is MORE than a slight bump the higher it clocks).
So, wouldn't running both cores more often than not, keeping them from reaching their peak speeds, be way more beneficial to battery life than letting one core hit its peak and then having the slacker core go "oh, dude... yeah, I'll take the bag of breads while you carry the remaining rack of pop, bag of canned veggies, frozen pizzas and laundry detergent."
I dunno, just a thought.. now that we have S-OFF, is there any way we can drop the threshold so the second core will kick on sooner, rather than later?
Before the phone came out I read somewhere that one core operates the Android system and radios while the other does user-initiated tasks.
I'm not really sure, but that makes sense to me. It would be nice to have a definite answer on this though.
It has asynchronous cores lol. Not asymmetrical.
doctor ladd said:
Before the phone came out I read somewhere that one core operates the Android system and radios while the other does user-initiated tasks.
I'm not really sure, but that makes sense to me. It would be nice to have a definite answer on this though.
Nah.. but with synchronous cores, both run at the same time. With asynchronous cores, one handles all the tasks and stuff, and only calls on the second one as needed. I don't know if this can be changed though. This is how the 3VO is.
Asynchronous just means each core can run at a different speed. It's much more efficient.
Product F(RED) said:
It has asynchronous cores lol. Not asymmetrical.
*facepalm* god... Yeah, I should have proofread it. Duh.
Anyways, from what I have read, the Evo 3D only really uses the second core when the first is overtaxed. That's what I was getting at. Wouldn't it still be more effective to call on the other core sooner?
Are We Not Phones?
We Are D3VO!
Are we not phones?!
3.DeE.V.O.!!
Here, this is from my answer to another person's question.
I have a question about the 3D's dual core that I'd like more clarification on; the answers I'm getting by searching this site and Google are vague. So I've read that the cores are asynchronous, basically meaning the second core doesn't do much work unless needed, whereas others like the Tegra 2 and Exynos have both cores running, or something similar to that, and that this is affecting the benchmark scores. I also read that one would basically double the score of the 3D to get a more accurate reading. Can anyone confirm or further explain this?
Yes, asynchronous is when something operates on another thread while the main thread is still available for operating. This allows for better performance in terms of managing tasks. Now just because it doesn't score high on a benchmark, it doesn't mean it isn't going to perform. Also, this allows for better performance for the battery.
I haven't slept for the past 12 hours so if this doesn't help you, just let me know and I will fully elaborate on how the processor will operate on the phone. Now time for bed :'(
In short, asynchronous operation means that a process operates independently of other processes.
Think of transferring a file. A separate thread will be utilized for doing so. You will then be able to do other things, such as playing with the UI (Sense), since you will be using the main thread. If anything were to happen to the transferring file (such as it failing), you will be able to cancel it because it is independent, on another thread.
I hope this makes sense man, kind of tired. Now I'm really going to bed.
Like my help? Thank me

dual core vs quad core

So I've been lurking on the Prime's forums for a while now and noticed the debate over whether the new Qualcomm dual core will be better than the current Tegra 3 that the Prime has. Obviously if both were clocked the same then the Tegra 3 would be better. Also, I understand that the GPU of the Tegra 3 is better. However, for a normal user (surfing the web, playing a movie, songs, etc.), isn't a dual core at 1.5 GHz better, in that an average user will rarely use more than 2 cores? The way I understand it, each core is able to handle 1 task, so in order to activate the 3rd core you would have to have 3 things going on at the same time? Could someone please explain this to me?
First of all, the Tegra 3 can go up to 1.6 GHz. Secondly, all 4 cores can be utilized by a multithreaded app. Lastly, battery life is great on the Tegra 3 due to the companion core.
jdeoxys said:
First of all, the Tegra 3 can go up to 1.6 GHz. Secondly, all 4 cores can be utilized by a multithreaded app. Lastly, battery life is great on the Tegra 3 due to the companion core.
But the native clock for that Qualcomm would be 1.5, meaning OCing can take it higher. Also, doesn't being dual core compared to quad core give it an edge in battery? You do bring up a good point with the multithreaded app. Also, to clarify, I am not standing up for the Qualcomm chip or putting down the Tegra 3, just trying to get things straight.
Thanks
Hey I'm the ....idiot aboard here....lol
But the Tegra 3 has a companion core, being a fifth core, to take over when the tablet is not stressed, thus saving the battery.
I am just repeating what I have read; I have no knowledge of how it all works. I guess that is how we can get better battery life.
Just trying to help the OP; maybe someone way smarter can chime in. Shouldn't be hard....lol
Quad core is better by far. On low-level tasks, simple things, and screen off/deep sleep, the companion core takes over, meaning it's running on a low-powered single core. This companion core only has a max speed of 500MHz. So when in deep sleep or on low-level tasks, the companion core alone is running everything at only 102MHz-500MHz, most of the time on the lower end. Therefore Tegra 3 has the better battery life, since all its low-power tasks are run by a single low-powered companion core. That's 1 low-powered core compared to 2 high-powered cores trying to save battery. Quad core is better all around. We haven't even begun real overclocking yet. The 1.6GHz speed was already in the kernel, so if you're rooted and using ViperControl, ATP Tweaks, or the Virtuous ROM, you can access those speeds at any time. Once we really start overclocking higher than 1.6GHz we will have an even bigger advantage. Anyone knows 4 strong men are stronger than 2.. lol. Tegra 3 and Nvidia are the future. Tegra 3 is just the chip that kicked down the door on an evolution of mobile SoCs.
---------- Post added at 10:13 PM ---------- Previous post was at 10:06 PM ----------
If you really want to learn the ins and outs of Tegra 3, all the details, and how it's better than any dual core, check out this thread I made. I have a whitepaper attachment in that thread you can download and read. It's made by Nvidia themselves and goes into great detail on Tegra 3, by the people who created it. Check it out.
http://forum.xda-developers.com/showthread.php?t=1512936
aamir123 said:
But the native clock for that Qualcomm would be 1.5, meaning OCing can take it higher. Also, doesn't being dual core compared to quad core give it an edge in battery? You do bring up a good point with the multithreaded app. Also, to clarify, I am not standing up for the Qualcomm chip or putting down the Tegra 3, just trying to get things straight.
Thanks
The maximum clock speed isn't all that important, since during tasks like web browsing, watching videos & movies and listening to music you will never push the processor to its highest available clock speed anyway. All mobile devices will underclock their processors so that you rarely have unused clock cycles eating up battery life. So, all things being relatively equal, performance would be about the same between both tablets during these types of lightweight tasks.
If you have a lot of background processes running, then the quad-core system might have an edge in performance, since theoretically different tasks can be pushed off to different processors. However, this use case is rarely found in Android. You might have an app checking weather or syncing photos in the background, or you might have music playing while you web surf, but those are generally fairly lightweight tasks that usually won't test the processor performance of your device.
In tasks that will stress your processor, such as 3D gaming, quad cores have a very large advantage over dual core systems, despite the slight difference in maximum clock speeds. In addition, the Tegra 3 has a more powerful GPU than the new Qualcomm chip, which will definitely make a noticeable difference in gaming performance.
Now when it comes to ultra-low-power tasks or when the tablet is on standby, the Tegra 3 uses its "companion core," which has incredibly low power requirements, so it can continue to sync your email, twitter and weather updates for days (or weeks) while having very little impact on the Transformer Prime's battery.
So in short, the Tegra 3 is more likely to outperform the Qualcomm in situations where you actually need extra performance. In light tasks performance between the two should be about the same. Battery life is yet to be definitively determined; however, the Tegra 3's ultra-low-power companion core should give it an edge when only doing light tasks or on standby.
Keep in mind, the Tegra 3 in the TF Prime has a maximum clock speed of 1300MHz with all cores active; with a single core active the maximum is 1400MHz. If all things were equal, a difference of 100-200MHz on a 1GHz+ processor is practically unnoticeable in daily usage.
almightywhacko said:
The maximum clock speed isn't all that important, since during tasks like web browsing, watching videos & movies and listening to music you will never push the processor to its highest available clock speed anyway. All mobile devices will underclock their processors so that you rarely have unused clock cycles eating up battery life. So, all things being relatively equal, performance would be about the same between both tablets during these types of lightweight tasks.
If you have a lot of background processes running, then the quad-core system might have an edge in performance, since theoretically different tasks can be pushed off to different processors. However, this use case is rarely found in Android. You might have an app checking weather or syncing photos in the background, or you might have music playing while you web surf, but those are generally fairly lightweight tasks that usually won't test the processor performance of your device.
In tasks that will stress your processor, such as 3D gaming, quad cores have a very large advantage over dual core systems, despite the slight difference in maximum clock speeds. In addition, the Tegra 3 has a more powerful GPU than the new Qualcomm chip, which will definitely make a noticeable difference in gaming performance.
Now when it comes to ultra-low-power tasks or when the tablet is on standby, the Tegra 3 uses its "companion core," which has incredibly low power requirements, so it can continue to sync your email, twitter and weather updates for days (or weeks) while having very little impact on the Transformer Prime's battery.
So in short, the Tegra 3 is more likely to outperform the Qualcomm in situations where you actually need extra performance. In light tasks performance between the two should be about the same. Battery life is yet to be definitively determined; however, the Tegra 3's ultra-low-power companion core should give it an edge when only doing light tasks or on standby.
Keep in mind, the Tegra 3 in the TF Prime has a maximum clock speed of 1300MHz with all cores active; with a single core active the maximum is 1400MHz. If all things were equal, a difference of 100-200MHz on a 1GHz+ processor is practically unnoticeable in daily usage.
Wow! Thanks for taking the time to break it down for me like that! I understand exactly where you're coming from and now have to agree.
demandarin said:
Quad core is better by far.
At least that is what Nvidia would like you to think.
The Tegra 3 uses an older ARM core for its quad-core design, while Qualcomm uses their own ARM-instruction-set-compatible core for their Krait S4 design. For most current benchmarks the Qualcomm Krait S4 dual core seems to outpace the Tegra 3 by quite a large margin. And of course Krait will be expanded to quad core later this year.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3
Dave_S said:
At least that is what Nvidia would like you to think.
The Tegra 3 uses an older ARM core for its quad-core design, while Qualcomm uses their own ARM-instruction-set-compatible core for their Krait S4 design. For most current benchmarks the Qualcomm Krait S4 dual core seems to outpace the Tegra 3 by quite a large margin. And of course Krait will be expanded to quad core later this year.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3
There's already another thread on what you just mentioned, and the Krait claims were easily shot down. Tegra 3 is still a better chip overall. Plus the Krait GPU was subpar compared to Tegra 3's. We have more links and stuff in the other thread showing the Prime is still right up there.
demandarin said:
There's already another thread on what you just mentioned, and the Krait claims were easily shot down. Tegra 3 is still a better chip overall. Plus the Krait GPU was subpar compared to Tegra 3's. We have more links and stuff in the other thread showing the Prime is still right up there.
As unlikely as that seems considering the slower cores that Nvidia uses, links to real benchmarks (not self-serving white papers) would be appreciated. I have glanced at your Tegra 3 thread but have not read it all the way through after I saw that it seemed to depend a lot on a white paper and not real comparison tests. It is true that the current GPU the Krait uses is not as fast as the one in the Tegra 3, but graphics is only one element of overall performance. The only benchmarks that I have seen Tegra beat out Krait on were benchmarks that emphasized more than two threads, and then not by much.
Dave_S said:
As unlikely as that seems considering the slower cores that Nvidia uses, links to real benchmarks (not self-serving white papers) would be appreciated. I have glanced at your Tegra 3 thread but have not read it all the way through after I saw that it seemed to depend a lot on a white paper and not real comparison tests. It is true that the current GPU the Krait uses is not as fast as the one in the Tegra 3, but graphics is only one element of overall performance. The only benchmarks that I have seen Tegra beat out Krait on were benchmarks that emphasized more than two threads, and then not by much.
It's not my Tegra 3 thread I'm talking about. I think it's the Prime alternatives thread created by shinzz. We had a huge debate over it. There are more benchmarks and supporting arguments in that thread. Check it out if you get the chance.
demandarin said:
It's not my Tegra 3 thread I'm talking about. I think it's the Prime alternatives thread created by shinzz. We had a huge debate over it. There are more benchmarks and supporting arguments in that thread. Check it out if you get the chance.
Thanks, will do. Gotta run to a doctor's appointment right now though.
I frankly think the power savings with the fifth core is mostly hype. According to many battery tests I've read online and my own experiences with my Prime, it doesn't get much different battery life from dual core tablets.
Quad core is better for the future but a problem for backwards compatibility... it's definitely good for a tablet.
jedi5diah said:
Quad core is better for the future but a problem for backwards compatibility... it's definitely good for a tablet.
Here is another benchmark that shows there is at least one current dual core that can soundly beat the Nvidia quad core at benchmarks that are not heavily multithreaded.
http://www.extremetech.com/computin...ragon-s4-has-the-competition-on-the-defensive
Buddy Revell said:
I frankly think the power savings with the fifth core is mostly hype. According to many battery tests I've read online and my own experiences with my Prime, it doesn't get much different battery life from dual core tablets.
No dual-core Android tablet's battery lasts longer than an iPad 1's. My Prime easily outlasts my iPad in battery life. The battery hype is real. Tons of people here are seeing 9-11+ hours on a single charge with moderate to semi-heavy use in balanced mode. Even longer in power savings mode.
demandarin said:
No dual-core Android tablet's battery lasts longer than an iPad 1's. My Prime easily outlasts my iPad in battery life. The battery hype is real. Tons of people here are seeing 9-11+ hours on a single charge with moderate to semi-heavy use in balanced mode. Even longer in power savings mode.
Really? I get 9-12 hours constant use on balanced. Plus 6 more with the dock.
Sent from my PG8610000 using xda premium
I think if Krait were to come out with a quad core then it would beat out Tegra 3; otherwise, no. Also, they are supposed to improve the chip with an updated GPU (3xx) in future releases. Also, benchmarks have been proven to be wrong in the past, so who knows? It's not like benchmarks can determine real-life performance, nor does the average user need that much power.
Companion core really does work
jdeoxys said:
Really? I get 9-12 hours constant use on balanced. Plus 6 more with the dock.
Sent from my PG8610000 using xda premium
Strange, we just started uni here (Australia) and I've been using my Prime all day, showing it off to friends (to their absolute amazement!): showing off Glowball, camera effects with eyes, mouth, etc., 2 hours of lecture typing, gaming on the train, watched a few videos and an episode of Community, played music on the speaker for about 40 mins, web browsed, etc. etc. I started using it lightly at 9 am (only properly at say 1:30 pm), and it's 10:00 pm now and GET THIS!!:
72% battery on the tablet and 41% on the dock. It's just crazy, man. No joke, it just keeps going. I can't help but admit the power saving must be real :/
Edit: Whoops, I quoted the wrong guy, but you get the idea.
That's what I'm saying. Battery life on the Prime is great. Add a dock and battery life is sick!
I do agree a quad-core variant of Krait or the S4 will give Tegra 3 a really good battle. Regardless, I'm more than satisfied with the power of Tegra 3. I'm not the type who, as soon as I see a newer or higher-spec tab, feels like mine is useless or outdated. We have development going hard now for this device. Just wait till the 1.8-2GHz+ overclock ROMs and kernels drop. Then we would even give new quad-core, higher-speed chips a good run.
Above all of that, Android needs more apps developed to take advantage of the more powerful chips like Tegra 3 and those that are upcoming. Software is still trying to catch up to hardware specs. Android apps haven't even all been made yet to take advantage of Tegra 2's power.. lol. With Nvidia/Tegra 3 we have an advantage, because developers are encouraged to make apps and games that take advantage of Tegra 3's power.
Regardless, we're all Android. We need to focus more on the bigger enemy, Apple and iOS.

Optimization help

Okay, so I'm REALLY anal about the speed of my phone, the slightest bit of stutter or lag from just the notification center itself really bothers me. I was wondering if someone could recommend some really good settings for my phone
I currently am running
JellyBam 6.3.0 (JB 4.2.2)
4Aces Kernel
I would like some good settings regarding governor, CPU Frequency, and any other things I can do including stuff in developer options, if that helps. Thanks!
It is likely that you will always have "some" degree of lag present on the Note 1, due in large part to our GPU. We are also limited in performance by our dual-core CPU.
That being said, the closest to zero lag I've found is using Flappjaxxx's current JB AOSP build (0225), combined with Nova Launcher and his newest 3.0.9 ALT kernel.
Window, transition, and animator settings set to "off" in developer settings.
Wheatley governor set to 384 min, 1.72 max.
System Tuner app with system controls set to "recommended".
No over/undervolt.
Forced GPU rendering in developer settings.
These are my main settings, but yours will likely differ.
Happy tuning....g
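For anyone who'd rather script those frequency limits than set them in an app, the usual approach is writing the cpufreq sysfs files as root; a tuner app is doing essentially the same thing under the hood. A rough sketch (the paths, the governor name, and the exact kHz values are assumptions that depend on your kernel):
Code:
import java.io.OutputStream;

public class FreqLimits {
    // Writes a value to a sysfs file via a root shell. Requires su; frequencies are in kHz.
    static void writeAsRoot(String path, String value) throws Exception {
        Process su = Runtime.getRuntime().exec("su");
        OutputStream os = su.getOutputStream();
        os.write(("echo " + value + " > " + path + "\nexit\n").getBytes());
        os.flush();
        su.waitFor();
    }

    public static void main(String[] args) throws Exception {
        String base = "/sys/devices/system/cpu/cpu0/cpufreq/";
        writeAsRoot(base + "scaling_governor", "wheatley");   // only if the kernel ships this governor
        writeAsRoot(base + "scaling_min_freq", "384000");     // 384 MHz floor, as in the post
        writeAsRoot(base + "scaling_max_freq", "1728000");    // roughly the 1.72 GHz ceiling mentioned
    }
}
Settings applied this way don't survive a reboot, which is why tuner apps reapply them at boot.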
^^Limited performance from "only a dual core" ...
Hardware is WAY ahead of software right now.
The second core is offline most of the time due to no load and developers of apps not fully understanding how to utilise multiple threads...
Adding more cores on top of an unused core ain't really gonna help much.
And yet we can't even stream a quality YouTube video above 22 FPS, all the while the MSM8660 specs boast a capability of nearly 60 FPS with the Adreno 220 GPU.
So my question is: are we seeing reduced performance from the CPU, or the GPU? It can't be all software, as we see the reductions across software ranging from GB to JB.
Drivers are in play of course, but I can hardly imagine a piece of code so poorly made as to reduce output capacity by 50%.
Not doubting you, brother, because I "know" you know your way around this machine, and because we have so many times traveled the same paths of thought. And it's entirely possible I'm missing a critical point here. But damn... I wanted the stated video speeds, and I'm not getting what Qual and company promised me. And in a direct comparison to the Note 2 quad, it's night and day as I watch the same video on my Note 1 next to my wife's 2. The odds are in favor of 2 cores running at low speed on the quad-core unit, as opposed to our Note 1 running a single core wide open until the second one is needed. That of course was the basis for my statement.
The OP can tweak for many great improvements, but I personally feel like we were duped on the claimed output of the 8660.....g
Just get a wife - then the phone lag will seem so trivial.
LOL .....he speaks the truth ....g
