[Q] 4430 vs 4460 - Motorola Droid RAZR

How big is the performance difference between these SoCs? And even though the 4460 is more powerful, will we still see a performance difference, given that the Galaxy Nexus drives a higher pixel count?

The RAZR will have the 4460 - the same SoC as the Galaxy Nexus. The information about the 4430 chipset is just an early guess. Motorola and their distributors have confirmed it will ship with the 4460.
UPDATE: I was wrong. They changed the information on the MOTODEV website; the specs now say it's the 4430. http://developer.motorola.com/products/razr-xt910/

nailll said:
The RAZR will have the 4460 - the same SoC as the Galaxy Nexus. The information about the 4430 chipset is just an early guess. Motorola and their distributors have confirmed it will ship with the 4460.
Can you cite that? Everything that I've been reading about the RAZR has suggested otherwise.
Also read this: http://androidforums.com/motorola-droid-razr/431262-4430-4460-update-2-does-not-have-4460-a.html
Explains the differences between the 2 chips.

They posted the 4430 on the MOTODEV portal a few days ago.

It's 4430.
http://developer.motorola.com/products/droid-razr-xt912/

Damn. Sorry. I was confused. http://www.droid-life.com/2011/10/1...with-full-specs-omap4460-processor-confirmed/

The Galaxy Nexus runs its TI OMAP 4460 at a 1.2GHz CPU clock and a 304MHz GPU clock.
It doesn't use the chip's full 1.5GHz CPU / 384MHz GPU speeds.
So the frequencies are the same between the Galaxy Nexus and the Moto RAZR.
The RAZR has a lower resolution: 960x540 vs 1280x720.
So the RAZR should be faster.
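To put rough numbers on the resolution argument above, here's a back-of-the-envelope sketch. The panel dimensions are the ones quoted in this thread; the "fill rate" framing is a simplification:

```python
# Rough pixel-load comparison: same GPU clock, different panel resolutions.
# qHD (Droid RAZR) vs 720p (Galaxy Nexus), per the specs quoted above.
razr_pixels = 960 * 540      # 518,400 pixels per frame
gnex_pixels = 1280 * 720     # 921,600 pixels per frame

ratio = gnex_pixels / razr_pixels
print(razr_pixels, gnex_pixels, round(ratio, 2))
# At equal GPU clocks the Galaxy Nexus pushes ~1.78x as many pixels per
# frame, so the RAZR has proportionally more fill rate to spare.
```

This is only about per-pixel GPU load; CPU-bound work is unaffected by the panel.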

Should the 4460 be more efficient than the 4430 at 1.2GHz?

Diagrafeas said:
The Galaxy Nexus runs its TI OMAP 4460 at a 1.2GHz CPU clock and a 304MHz GPU clock.
It doesn't use the chip's full 1.5GHz CPU / 384MHz GPU speeds.
So the frequencies are the same between the Galaxy Nexus and the Moto RAZR.
The RAZR has a lower resolution: 960x540 vs 1280x720.
So the RAZR should be faster.
Where are you seeing that the GPU clock in the Galaxy Nexus is 304MHz?

Chirality said:
Where are you seeing that the GPU clock in the Galaxy Nexus is 304MHz?
Here are the specs: http://en.wikipedia.org/wiki/Texas_Instruments_OMAP#OMAP_4
The Galaxy Nexus has 384MHz.

soremir said:
Here are the specs: http://en.wikipedia.org/wiki/Texas_Instruments_OMAP#OMAP_4
The Galaxy Nexus has 384MHz.
CPU and GPU frequencies are linked.
So I strongly doubt that with a 1.2GHz CPU you can get a 384MHz GPU.

Diagrafeas said:
CPU and GPU frequencies are linked.
So I strongly doubt that with a 1.2GHz CPU you can get a 384MHz GPU.
What makes you think that the CPU and GPU clocks are linked?

Chirality said:
What makes you think that the CPU and GPU clocks are linked?
They aren't... I don't know where he got that.

didibabawu said:
Should the 4460 be more efficient than the 4430 at 1.2GHz?
With emphasis on "should", yes. All things being equal in the wild world of chip yields, the 4430 "should" require more effort to reach 1.2GHz. Won't be long to find out.

rushless said:
With emphasis on "should", yes. All things being equal in the wild world of chip yields, the 4430 "should" require more effort to reach 1.2GHz. Won't be long to find out.
I believe the 4430 actually comes in two flavors, a 1GHz and a 1.2GHz. I don't believe they are taking the 1GHz processor and overclocking it.

My understanding is that the 4430 and 4460 come from the same wafers. The 4430s are just the ones that did not run reliably at 1.5GHz but would at 1.2GHz. It's kind of the same with PC processors: AMD had some quad cores that would only run reliably on 3 cores. Instead of throwing them away, they changed the model number and sold them. This has been common for years. So it's possible someone's 4430 might run reliably at 1.4GHz.
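The speed-binning idea described above can be sketched in a few lines. Purely illustrative - the thresholds and wafer results here are made up, not TI's actual binning rules:

```python
# Illustrative speed-binning: dies from one wafer are tested for their
# maximum stable clock and sold under different part numbers.
# Thresholds are hypothetical, loosely based on the 4460 (1.5GHz) and
# 4430 (1.2GHz) rated clocks discussed in this thread.
def bin_die(max_stable_ghz):
    if max_stable_ghz >= 1.5:
        return "OMAP4460"   # passes at the top rated clock
    if max_stable_ghz >= 1.2:
        return "OMAP4430"   # stable at 1.2 but not 1.5 - rebadged, not discarded
    return "reject"

# Hypothetical test results for dies off the same wafer:
wafer = [1.6, 1.4, 1.25, 1.55, 1.1, 1.45]
bins = [bin_die(f) for f in wafer]
print(bins)
# Note the 1.4 and 1.45 dies: sold as 4430s, yet they'd likely overclock
# well past 1.2GHz - exactly the point made above.
```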

Oaklands said:
My understanding is that the 4430 and 4460 come from the same wafers. The 4430s are just the ones that did not run reliably at 1.5GHz but would at 1.2GHz. It's kind of the same with PC processors: AMD had some quad cores that would only run reliably on 3 cores. Instead of throwing them away, they changed the model number and sold them. This has been common for years. So it's possible someone's 4430 might run reliably at 1.4GHz.
^ That is all that needs to be said.

Oaklands said:
My understanding is that the 4430 and 4460 come from the same wafers. The 4430s are just the ones that did not run reliably at 1.5GHz but would at 1.2GHz. It's kind of the same with PC processors: AMD had some quad cores that would only run reliably on 3 cores. Instead of throwing them away, they changed the model number and sold them. This has been common for years. So it's possible someone's 4430 might run reliably at 1.4GHz.
Usually when chips are binned this way, the higher-binned and lower-binned chips tend to be released at about the same time. However, there's a several-month gap between the release of OMAP4430-based devices and OMAP4460-based devices, which seems to indicate that they are manufactured separately. Granted, this is the SoC market, with long lead times and complicated device development cycles, so perhaps the chips were available at the same time and it has just taken longer for OMAP4460 devices to reach market. But the big gap between release timeframes suggests to me that these two SoCs are produced separately.

Chirality said:
Usually when chips are binned this way, the higher-binned and lower-binned chips tend to be released at about the same time. However, there's a several-month gap between the release of OMAP4430-based devices and OMAP4460-based devices, which seems to indicate that they are manufactured separately. Granted, this is the SoC market, with long lead times and complicated device development cycles, so perhaps the chips were available at the same time and it has just taken longer for OMAP4460 devices to reach market. But the big gap between release timeframes suggests to me that these two SoCs are produced separately.
There's nothing preventing them from releasing them months apart.

zetsumeikuro said:
There's nothing preventing them from releasing them months apart.
Yes there is: inventory. If they are binning chips off the same line but only selling the lower-binned ones while holding back the higher-binned ones for several months, then they are piling up inventory of the higher-clocked chips and doing nothing with it.
Now, it is possible that the yield on the higher-clocked chips is very low, such that only after several months of binning did they have enough inventory to supply OEMs. But that would mean you'll probably have a harder time overclocking the 4430s, given how much difficulty they had with yields at higher clocks.

Related

Question on processor based on factsheet

Motorola URL: http://mediacenter.motorola.com/Fact-Sheets/Motorola-ATRIX-4G-Fact-Sheet-353b.aspx (Screenshot attached for those reading on a device.)
Line of interest:
Processor: 2 processor cores running at 1GHz each
Nvidia URL: http://www.nvidia.com/object/tegra-2.html
Lines of interest:
CPU: Dual-Core ARM Cortex A9
Frequency: 1 GHz, per core
Does this mean we have an effective 2GHz of processing power in this device?
On a side note, my laptop is a 2GHz quad core, with each core at ~500MHz adding up to 2GHz in all. So that line got me confused.
I've never heard of a 500MHz quad-core processor, but I have heard of a 2GHz quad-core processor, effectively providing 8GHz of processing power.
Nah, it really doesn't work like that. Each core will only run at 1GHz MAX; the benefit of having a second (or more) core is that while you are doing something, the other core handles background work and you don't get bogged down. Or, if the app supports it, it can use both. Here's where things get fun: if your app uses both cores running at 1GHz each, it can TECHNICALLY process as fast as a 2GHz SINGLE CORE, but in practice you get more like 50-75% extra performance from the second core. So I guess TECHNICALLY it would be the same as a single-core 2GHz CPU... but at the same time, not really? A 2GHz single core would do single tasks faster, but for multitasking the dual core is way better IMO. Hope that helps some.
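The "50-75% more from the second core" intuition lines up with Amdahl's law. A quick sketch - the parallel fractions below are illustrative, not measured from any real app:

```python
# Amdahl's law: speedup from n cores when a fraction p of the work can
# run in parallel. Going dual-core only doubles throughput if p == 1.0,
# which real apps almost never achieve.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (1.0, 0.8, 0.5):
    print(f"parallel fraction {p:.0%}: {amdahl_speedup(p, 2):.2f}x on 2 cores")
# With 80% parallel work, two 1GHz cores give roughly a 1.67x speedup -
# i.e. in the "50-75% more performance" range mentioned above, not a
# flat 2GHz.
```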
harolds said:
I've never heard of a 500MHz quad-core processor, but I have heard of a 2GHz quad-core processor, effectively providing 8GHz of processing power.
Check in any CPU analyzer. Mine is a quad core, and each processor gets noted as ~500MHz. I think you can find it even in Device Manager.
Initially I too thought that I was getting 8GHz of power from my CPU, only to find later that it was in fact 500x4.
Strange, my office system (a desktop) is a dual core and shows each core at 3GHz. Will check once more on my laptop when I get home. This is crazy!
But good to know. Even the graphics part of it has 8 cores. Was going through the specs. It rocks!
diablo009 said:
Check in any CPU analyzer. Mine is a quad core, and each processor gets noted as ~500MHz. I think you can find it even in Device Manager.
Initially I too thought that I was getting 8GHz of power from my CPU, only to find later that it was in fact 500x4.
Strange, my office system (a desktop) is a dual core and shows each core at 3GHz. Will check once more on my laptop when I get home. This is crazy!
But good to know. Even the graphics part of it has 8 cores. Was going through the specs. It rocks!
Are you sure your processor hasn't been underclocked as part of some sort of battery saving feature? I don't think most applications can even utilize all 4 cores, which would mean individual applications would perform...pretty slowly. Right?
chbearsrock said:
Are you sure your processor hasn't been underclocked as part of some sort of battery saving feature? I don't think most applications can even utilize all 4 cores, which would mean individual applications would perform...pretty slowly. Right?
This has always baffled me. I'll check this evening and update here. But for now I am super happy about the processor in the Atrix.
If you are on Windows, run CPU-Z and post a screenshot.
skaboss610 said:
If you are on Windows, run CPU-Z and post a screenshot.
Here you go.
It's this processor; each core runs at a clock of 2GHz.
http://ark.intel.com/Product.aspx?id=40480
According to that screenshot, you have
2GHz * 4 cores = 8GHz
so... you had 8GHz all along!
Techcruncher said:
According to that screenshot, you have
2GHz * 4 cores = 8GHz
so... you had 8GHz all along!
Aaah!!! No wonder I paid $1300 for this laptop in July 2009. And no wonder games released even in 2011 play so well without any frame-rate issues.
Thanks for clearing this up, kind sir.

Evo 3D GPU

So is it true that the GPU on the EVO 3D sucks? Or is outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
I guess I should rephrase one of my questions. I'm asking how it will run the emulators because I saw someone using an SG playing N64oid and it seemed pretty laggy, and if I'm not mistaken that has the same/similar GPU to the NS4G?
tannerw_2010 said:
So is it true that the GPU on the EVO 3D sucks? Or is outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
The emulators use the CPU; the Evo 3D will be fine. The PSX emulator runs fine on my 18-month-old Desire.
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
[email protected] said:
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
Yeah, I've heard that too. So it makes me wonder what's really true. It might tell you something that I heard the GPU isn't very good from the NS boards... but I think I've heard it on these boards too, just not nearly as much.
Look up YouTube videos of the GPU in action. 'Nuff said.
Sent from my Nexus S 4G using XDA Premium App
Maybe this will calm your fears
http://www.youtube.com/watch?v=DhBuMW2f_NM
Here's a better one:
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=youtube_gdata_player
Sent from my Nexus S 4G using XDA Premium App
The GPU in the Evo 3D should be the best out right now; it's supposed to be up to twice as fast/powerful as the Tegra 2. It does appear that some optimizations are needed to take full advantage of this GPU, though, hence some of the early, low benchmarks.
The GPU is the fastest right now. No need to speculate; it will be until Tegra 3 comes out, and I think it will still match Tegra 3 in most benchmarks. The SGX540 is good, but the Adreno 220 is faster.
What about the CPU? Is it worse than the Galaxy S CPU, or better?
jamhawk said:
What about the CPU? Is it worse than the Galaxy S CPU, or better?
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
a5ehren said:
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
donatom3 said:
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
Now that's gonna make all the difference.
Sent from my PC36100 using XDA Premium App
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt there's anything the Samsung processor can do that it can't, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
nkd said:
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt there's anything the Samsung processor can do that it can't, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though. Either way, plenty of muscle for any mobile platform.
firmbiz94 said:
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though. Either way, plenty of muscle for any mobile platform.
This thread slightly confuses me. The OP mentions the NS4G in the first post, then we have someone coming in asking about comparisons to the Galaxy S (S or S2?), and everyone answers about the GS2. Quick stat breakdown to answer whatever question is actually being asked here:
Nexus S 4G has:
1.0GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S has:
1.0GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S2 (Euro) has:
1.2GHz dual-core Orion CPU
Mali-400 GPU
Evo 3D has:
1.2GHz dual-core MSM8660 CPU
Adreno 220 GPU
(Info from GSMArena)
The Nexus S and Galaxy S are last generation's phones, so to answer the OP: no, the Evo 3D doesn't have the same GPU/CPU as the NS4G. Not even similar. It's a generation (maybe even two) up. The Evo 4G is slightly slower than the NS4G, and it's running a 1.0GHz Snapdragon with an Adreno 200 (not even a 205, which is the next in line before the 220).
As for the GS2 vs. Evo 3D, they're supposed to be on par with each other, with the GS2 maybe being a bit faster, since Qualcomm isn't the best with GPUs. (Personal opinion.) However, AFAIK nobody has done any real testing on the Sensation vs. the GS2 (same CPU/GPU), so there's no real data backing up that claim... The GS2 DOES have better benchmark scores, though, so take that as you will.
Disclaimer: I found all the numbers on the internets. They may be wrong.
You can't really prove anything without concrete proof. There is still no scientific or dedicated performance comparison of all the GPUs found in dual cores.
I'd say all the posts in this thread are just personal opinions.
The only things we can compare right now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI OMAP
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used the Cortex-A9. I wonder why they chose the Cortex-A8 instead of the A9; the A8 is such old hardware now.
Don't worry too much about the A8 vs A9 thing... the differences are not huge (45nm vs 40nm). Also, Qualcomm heavily optimized Scorpion so that it can actually do things the A9 can't. It will provide plenty of power. I would go into more detail, but that seems to upset some people on other threads.
peacekeeper05 said:
You can't really prove anything without concrete proof. There is still no scientific or dedicated performance comparison of all the GPUs found in dual cores.
I'd say all the posts in this thread are just personal opinions.
The only things we can compare right now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI OMAP
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used the Cortex-A9. I wonder why they chose the Cortex-A8 instead of the A9; the A8 is such old hardware now.
Almost all of the GPU benchmarks I've seen go like this:
1. Qualcomm
2. TI OMAP
3. Exynos
4. Tegra 2
5. Hummingbird
Qualcomm uses the A8 because they don't use the reference designs from ARM. Snapdragon outperforms the Cortex-A8 reference by 20-30%, making it pretty close to the A9 reference.

[Q] Evo3D processor

Simple question: is the 3VO's processor really 1.5GHz underclocked to 1.2GHz? I had seen this information floating around, but none of my searches turn up anything firmly confirming or denying it.
Thanks
That's what I've also heard; however, I still can't find anything to confirm or deny it.
Nobody knows, eh?
Yes it is underclocked.
Appreciate my help? Thank me
DDiaz007 said:
Yes it is underclocked.
Sources????
You can't be serious? This has been discussed and answered dozens of times... Google MSM8660.
DDiaz007 said:
You can't be serious? This has been discussed and answered dozens of times... Google MSM8660.
That doesn't help; the MSM8660 comes in a 1.2GHz and a 1.5GHz variant.
poweroutlet said:
That doesn't help; the MSM8660 comes in a 1.2GHz and a 1.5GHz variant.
........
It comes in two different factory clocks, which is what you said. One is lower than the other because of manufacturer requests and because having 1.5GHz on a phone is pointless. If I pull the CPU's supported frequencies, it says it supports 1512000kHz, which is 1.5GHz. The 8672 comes factory clocked at 1.5GHz... They are all the same SoC, but with different applications, such as one supporting CDMA and another GSM. The ones that come at 1.2GHz do so because they are used in a phone; in a tablet or netbook the clock would be 1.5GHz, which would be the 8672 or 8660.
Rest assured that 1.5GHz is a supported frequency for the 8660.
In the end, they are the same SoC, running the same architecture. There is nothing different between the MSM 8260, 8660, and 8672 (which was cancelled). They are all on the 45nm process too.
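On a rooted device you can check the supported frequency table yourself through cpufreq's sysfs interface. A sketch - the sample string below is a hypothetical stand-in for an on-device read, and the sysfs path assumes a Linux/Android kernel with cpufreq enabled:

```python
# The kernel exposes supported CPU frequencies (in kHz) under:
#   /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies
# On-device you would read that file; here we parse a hypothetical
# sample of what an MSM8660 might report.
def parse_freqs_khz(text):
    """Turn a space-separated kHz list into sorted GHz values."""
    return sorted(int(tok) / 1_000_000 for tok in text.split())

sample = "192000 384000 768000 1024000 1188000 1512000"  # hypothetical
freqs = parse_freqs_khz(sample)
print(freqs)
print("1.5GHz-class step supported:", 1.512 in freqs)
# 1512000 kHz == 1.512GHz - the "1512000" figure quoted above.
```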
DDiaz007 said:
It comes in two different factory clocks, which is what you said. One is lower than the other because of manufacturer requests and because having 1.5GHz on a phone is pointless. If I pull the CPU's supported frequencies, it says it supports 1512000kHz, which is 1.5GHz. The 8672 comes factory clocked at 1.5GHz... They are all the same SoC, but with different applications, such as one supporting CDMA and another GSM. The ones that come at 1.2GHz do so because they are used in a phone; in a tablet or netbook the clock would be 1.5GHz.
Too bad you can't be sure of that. That MAY be the case, but it may also be the case that the 1.2GHz MSM8660s are the lower-binned chips and the 1.5GHz ones are the higher-binned units. This is done all the time in the CPU world. Someone gave an example here of how AMD sold the Barton 2500+, which was really just a lower-binned 3200+, a CPU that was far more expensive.
Your point that they are all the same SoC is not relevant; Intel and AMD, for example, have sold many processors that are identical in architecture and every spec down to TDP, with frequency as the only difference. The higher-binned chips become the higher-specced CPUs and the lower-binned ones become the lower-end parts. This doesn't mean a lower-binned CPU can't exceed its specification, but it does mean the higher-binned CPU can likely go even higher. In any case, they are certainly not equal.
Just because they are the same SoC does not mean you can assume the 1.2GHz and 1.5GHz units are the same. That's like assuming the Intel Pentium 4 2.4C and 3.0C are the same. They are the exact same CPU: same architecture, same cache, FSB, etc., except one is clocked a bit higher and is of a higher bin. The 3.0C was the superior unit (higher bin, better ability to overclock, etc.).
My point is, we don't actually know whether Qualcomm is giving us simply downclocked versions of the 1.5GHz chip, or whether our 1.2s are just lower-binned 1.5s. The latter would make more sense for them in terms of profits, so it's not surprising that this is a common practice in the industry.
poweroutlet said:
Too bad you can't be sure of that. That MAY be the case, but it may also be the case that the 1.2GHz MSM8660s are the lower-binned chips and the 1.5GHz ones are the higher-binned units. This is done all the time in the CPU world. Someone gave an example here of how AMD sold the Barton 2500+, which was really just a lower-binned 3200+, a CPU that was far more expensive.
Your point that they are all the same SoC is not relevant; Intel and AMD, for example, have sold many processors that are identical in architecture and every spec down to TDP, with frequency as the only difference. The higher-binned chips become the higher-specced CPUs and the lower-binned ones become the lower-end parts. This doesn't mean a lower-binned CPU can't exceed its specification, but it does mean the higher-binned CPU can likely go even higher. In any case, they are certainly not equal.
Just because they are the same SoC does not mean you can assume the 1.2GHz and 1.5GHz units are the same. That's like assuming the Intel Pentium 4 2.4C and 3.0C are the same. They are the exact same CPU: same architecture, same cache, FSB, etc., except one is clocked a bit higher and is of a higher bin. The 3.0C was the superior unit (higher bin, better ability to overclock, etc.).
My point is, we don't actually know whether Qualcomm is giving us simply downclocked versions of the 1.5GHz chip, or whether our 1.2s are just lower-binned 1.5s. The latter would make more sense for them in terms of profits, so it's not surprising that this is a common practice in the industry.
I see what you are talking about... I forgot about bins. I know about them on PCs, but didn't think much of it for a smartphone.
I'm going to say you may be right about the bins. There are some people on here who can't get past 1.5GHz for the life of them.
DDiaz007 said:
I see what you are talking about... I forgot about bins. I know about them on PCs, but didn't think much of it for a smartphone.
Yeah; regardless, though, our CPUs are already doing 1.8GHz stable and maybe even higher. That's plenty fast for me, so I don't really care if the 1.5s are even better at clocking (well, I might care if I start seeing the 1.5GHz phones breaking 2GHz, haha).
poweroutlet said:
Yeah; regardless, though, our CPUs are already doing 1.8GHz stable and maybe even higher. That's plenty fast for me, so I don't really care if the 1.5s are even better at clocking (well, I might care if I start seeing the 1.5GHz phones breaking 2GHz, haha).
Yea me too
You've been thanked for reminding me of the bins. Not once did that cross my mind.
#fail
DDiaz007 said:
You've been thanked for reminding me of the bins. Not once did that cross my mind.
#fail
No worries man.

Evo 3D's asynchronous dual core?

I was just thinking about something. Is it really a fair comparison between an asynchronous dual core and a conventional dual core such as the Tegra or the OMAP4? We all know how everyone loves to compare benchmarks on phones. Also, we all know that the 3D does horribly in Quadrant scores. Is this because of the type of CPU we have? If it is... is it really fair to even try to compare them?
My thinking is that if both of our cores ran at the same speed all of the time, our CPU would dominate everything in benchmarks. Am I wrong in thinking that? Is there any way we could truly know?
PS: Hope this isn't dumb thinking. If it is, please just state why and move on. I am NOT trying to start a flame war or troll thread. This is a 100% sincere question.
Thanks in advance!
Sent by my supercharged dual core from the 3rd dimension.
Benchmark scores mean **** anyway. I don't know why people insist on using them. If the phone runs well, it runs well.
A tad slower, mostly because it's based on a similar ARM Cortex-A8 design. Those others, like the Galaxy S2 and other SoCs, are based on the newer Cortex-A9 design. This has been analyzed several times on AnandTech and other sites. Besides, those benchmarks are not dual-core at all, so we are comparing apples to apples; the difference is in the designs. If you compare two CPUs clocked at the same speed (Snapdragon/A8 vs A9), the A9 will come out ahead.
Sent from my PG86100 using XDA App
I understand that benchmarks don't mean anything. I just want to know if the fact that our CPU is asynchronous has anything to do with the exceptionally low scores compared to other devices.
Sent from my PG86100 using XDA App
I'd chalk it up to the fact that the most recent OMAP and Exynos chips are based on the A9, while our Scorpion cores are heavily modified A8 designs by Qualcomm.
Ours are in between A8 and A9.
Sent from my PG86100 using xda premium
I only loosely understand the difference between A9- and A8-based chips, but I personally think the current Snapdragon in the Shooter (MSM8660?) is a much superior chip to the Tegra 2. I got tired of my OG Evo, so I bought the Shooter off contract from a buddy for cheap, and I plan to get the Nexus Prime, which I believe will land at Sprint before January (contract up). The rumor is that it will use an OMAP 4460 clocked at 1.5GHz. Just rumors, I know. But how will that compare to the Snapdragon in terms of speed and battery?
Sent from my PG86100 using Tapatalk
ROM Synergy 318, OC'd to 1.8GHz (2.3.3 base), literally SMOKED the SGS2 - it was hitting 4000+ in Quadrant Advanced. But yeah, scores mean nothing. We should have OC again soon and get nice shiny scores again.
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to performance gains by itself. Companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and an AMD Athlon X2, but they are both based on the x86 architecture. There's a lot of confusion around the whole A8/A9 terminology, so honestly, I don't think it matters much which ARM revision our SoC in the Evo 3D is using.
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Hard to really say which processor is more powerful, but at this stage all the smartphone dual cores seem powerful enough that it doesn't matter. Asynchronous vs. the other guys may be a different story, though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
So, to elaborate on what you're asking, I guess: asynchronous cores have nothing to do with the benchmarks, because those benchmarks only run on one core anyway (I'm pretty sure).
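For the curious, per-core clocks are visible in sysfs, which is how you can actually watch asynchronous scaling happen. A sketch - the readings below are made-up examples, and the paths assume a Linux/Android kernel with per-core cpufreq:

```python
# Each core reports its current clock (in kHz) at:
#   /sys/devices/system/cpu/cpu<N>/cpufreq/scaling_cur_freq
# With asynchronous (per-core) scaling the files can disagree; with
# synchronous scaling they always match. The snapshot values here are
# hypothetical stand-ins for what you'd read on-device.
def is_async_snapshot(core_khz):
    """True if at least two cores are at different clocks right now."""
    return len(set(core_khz)) > 1

snapshot = [1188000, 384000]   # hypothetical: core0 busy, core1 idling
print("cores at different clocks:", is_async_snapshot(snapshot))
```

A single matching snapshot proves nothing (async cores can momentarily agree), but any mismatched snapshot demonstrates per-core scaling.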
sprinttouch666 said:
Hard to really say which processor is more powerful, but at this stage all the smartphone dual cores seem powerful enough that it doesn't matter. Asynchronous vs. the other guys may be a different story, though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
OK. Now, what about performance-wise? Will we be at an advantage or a disadvantage?
lyon21 said:
OK. Now, what about performance-wise? Will we be at an advantage or a disadvantage?
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is:
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
lyon21 said:
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Depends... If you are benchmarking with a non-multithreaded app like Quadrant, it doesn't matter, as you're running on a single core on both; the A9 will be faster. And if you're running a multithreaded benchmark that fully uses both cores, then the "asynchronous" thing goes out of play, as you're using both cores on both devices.
Sent from my PG86100 using XDA App
il Duce said:
ROM Synergy 318, OC'd to 1.8GHz (2.3.3 base), literally SMOKED the SGS2 - it was hitting 4000+ in Quadrant Advanced. But yeah, scores mean nothing. We should have OC again soon and get nice shiny scores again.
Well, then, if you overclock an A9 to 1.8GHz you're back to square one, and the A9 is still faster. I think Qualcomm has already announced their roadmap, and an A9 killer is on its way. I think it's a quad core with an Adreno 3xx (they'll also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race).
Sent from my PG86100 using XDA App
sn0b0ard said:
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to any performance gains. The companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and an AMD Athlon X2, but they are both based on the x86 architecture. There's a lot of confusion around the whole A8/A9 terminology, so honestly, I don't think it matters much which ARM revision our SoC in the Evo 3D is using.
Yes, it matters. As in your comparison, each chip has new instruction sets, pipelines and optimizations. Clock for clock, and as the other guy said, our Snapdragons are between an A8 and an A9, and the A9 is simply faster. Ours is an older architecture. By no means a slouch, but it's the truth.
Sent from my PG86100 using XDA App
jamexman said:
Yes, it matters. As in your comparison, each chip has new instruction sets, pipelines and optimizations. Clock for clock, and as the other guy said, our Snapdragons are between an A8 and an A9, and the A9 is simply faster. Ours is an older architecture. By no means a slouch, but it's the truth.
Sent from my PG86100 using XDA App
See, here's the thing. Qualcomm doesn't just use stock ARM cores. They licensed the architecture and made their own Snapdragon chipset. Is the Snapdragon family old? Yes, it has been around for a while. Is the chipset in the Evo 3D old? Not really; it was developed by Qualcomm relatively recently and expands on their existing, proven QSD chipset. This is comparing apples to oranges; they are just two different SoCs. If you were to take a completely stock Cortex-A9 and put it against a completely stock Cortex-A8, then yes, the A9 obviously is going to win, but these companies try to market their CPUs as one generation ahead of the others, when in reality they modify the hell out of the ARM designs to make their chipsets.
sn0b0ard said:
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is:
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
Totally off topic, sorry!!!
Just followed the link above and WOW!! How can we con Qualcomm into giving us a copy of that home launcher they use, with the live wallpaper as well... HMMMMM
jamexman said:
Well, then if you overclock an A9 to 1.8GHz you're back to square one, and the A9 is still faster. I think Qualcomm has already announced their roadmap and an A9 killer is on its way. I think it's a quad core with Adreno 3xx (they'll also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race).
Sent from my PG86100 using XDA App
It is much harder to push an A9-based SoC to 1.8GHz than the Scorpion-based MSM8660. Clock for clock, the A9 will be faster: it has greater IPC and a shorter pipeline, but that also prevents it from reaching frequencies as high as a Scorpion or A8 design. How many 1.8GHz Exynos chips do you see? In some regards the MSM8660 clearly beats some A9-based SoCs, like the Tegra 2, which even lacks hardware support for NEON instructions. Snapdragons have also traditionally had high floating-point performance.
Also there is no competition between Qualcomm and ARM. Qualcomm simply licenses designs from ARM and then customizes them for its own needs.
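The clock-for-clock comparison above reduces to the usual first-order model, performance ≈ IPC × clock. The IPC values below are illustrative assumptions (the real Scorpion vs. Cortex-A9 ratio varies by workload); the sketch just shows how a higher-IPC core can match an older core running 300MHz faster.

```python
# First-order performance model: perf ~ IPC * clock.
# IPC values are illustrative assumptions, not published figures.
def perf(ipc, clock_ghz):
    return ipc * clock_ghz

scorpion = perf(2.0, 1.5)  # older core, higher clock
a9       = perf(2.5, 1.2)  # higher-IPC core, lower clock

# Despite a 300 MHz clock deficit, the A9-class core breaks even here --
# which is why "clock for clock, A9 will be faster".
print(scorpion, a9)
```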

Nexus Prime/Galaxy to have same GPU as our phone?

According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a TI OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR SGX540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the TI OMAP 4470 SoC is due in late 2011 or early 2012. Perhaps Google/Samsung will work with TI to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR SGX544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models in their new flagship phone. What are your thoughts?
Zacisblack said:
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a TI OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR SGX540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the TI OMAP 4470 SoC is due in late 2011 or early 2012. Perhaps Google/Samsung will work with TI to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR SGX544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models in their new flagship phone. What are your thoughts?
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
Nope, it's Samsung. You can take off your tinfoil hat, since that was officially confirmed about a year ago.
OP, where did you get that information? It's been stated that it will have an Exynos processor, the latest and greatest from Samsung. I don't have a source, but the whole point of the Nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
sageDieu said:
Nope, it's Samsung. You can take off your tinfoil hat, since that was officially confirmed about a year ago.
OP, where did you get that information? It's been stated that it will have an Exynos processor, the latest and greatest from Samsung. I don't have a source, but the whole point of the Nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
Not saying it's 100% but 4/5 Android websites have concluded that the OMAP series is the platform of choice for Google's new OS. No tech blog/website has stated it will have Exynos. And the OMAP 4470 would be more powerful either way. But below, Android Police strongly asserted that the new device will have the OMAP 4460 downclocked to 1.2GHz. But like I said, I'm asking for everyone's thoughts because I can definitely see Google surprising us.
http://www.androidpolice.com/2011/1...eam-sandwich-phone-sorry-prime-is-not-likely/
You can also check Engadget, AndroidCentral, Anandtech, Android Authority,and PhanDroid.
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
You could be partially right. Some rumors have suggested that the Prime and Galaxy Nexus are two different devices. What saddens me is that the Galaxy Nexus I-9250 passed through the FCC with GSM only.
The 4460 has a 100MHz boost in GPU clock compared to ours, and I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
TheSonicEmerald said:
The 4460 has a 100MHz boost in GPU clock compared to ours, and I can't think of any game/app that would need more than that.
184MHz, I think -- almost double. Except the Nexus is going to have 2.4 times the pixels of the Fascinate (or 2.22 if you don't count the soft-key area).
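The ratios in this exchange can be checked arithmetically. The 96-pixel soft-key height below is an assumption chosen to reproduce the 2.22 figure, and 200MHz vs. 384MHz are the SGX540 clocks discussed in this thread.

```python
# Pixel-count and GPU-clock arithmetic behind the comparison above.
nexus     = 1280 * 720  # Galaxy Nexus HD panel
fascinate = 800 * 480   # Fascinate WVGA panel

print(nexus / fascinate)  # 2.4x the pixels
# Assuming the on-screen buttons occupy 96 of the 1280 rows:
print(round((1280 - 96) * 720 / fascinate, 2))  # 2.22x

# GPU clock: 200 MHz (our SGX540) vs 384 MHz (4460-class)
print(384 - 200)  # 184 MHz faster
print(384 / 200)  # 1.92x -- "almost double"

# Net: the clock grew 1.92x but the pixel count grew 2.4x, so per-pixel
# GPU throughput actually drops slightly on the higher-resolution panel.
print(round((384 / 200) / (nexus / fascinate), 2))  # 0.8
```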
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
oh tonu, still trying to have conversations about things you know nothing about.
Sent from my Incredible 2 using XDA App
TheSonicEmerald said:
The 4460 has a 100MHz boost in GPU clock compared to ours, and I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
Clock speed alone isn't going to improve graphics. The dual-core PowerVR SGX543MP2 GPU in the A5 chip would still run laps around an overclocked SGX540 in terms of speed, throughput and things such as shadows, textures and triangles.
Zacisblack said:
Clock speed alone isn't going to improve graphics. The dual-core PowerVR SGX543MP2 GPU in the A5 chip would still run laps around an overclocked SGX540 in terms of speed, throughput and things such as shadows, textures and triangles.
Hah. Imagine having the PowerVR SGX543MP4 from the PS Vita in the Prime. That would run laps around the MP2 XD
Zacisblack said:
Clock speed alone isn't going to improve graphics. The dual-core PowerVR SGX543MP2 GPU in the A5 chip would still run laps around an overclocked SGX540 in terms of speed, throughput and things such as shadows, textures and triangles.
I don't understand why Google put such a crappy GPU in their flagship phone. They easily could have used the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
cherrybombaz said:
I don't understand why Google put such a crappy GPU in their flagship phone. They easily could have used the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
They probably put it in because they could optimize the software around the hardware. This means the Galaxy Prime will run extremely well with ICS, probably better than some phones with dual-core GPUs, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter much; it will be really fast. They've also increased the clock speed from 200MHz to 384MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?" The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably the better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away, but it didn't. I like the way it looks, but the hardware is just lacking, and it's not worth my upgrade or $300.
Very well stated. I'm also not all-in on the GN. We'll see once I can actually play with one in a store next month.
Sent from my SCH-I500 using XDA Premium App
Zacisblack said:
They probably put it in because they could optimize the software around the hardware. This means the Galaxy Prime will run extremely well with ICS, probably better than some phones with dual-core GPUs, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter much; it will be really fast. They've also increased the clock speed from 200MHz to 384MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?" The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably the better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away, but it didn't. I like the way it looks, but the hardware is just lacking, and it's not worth my upgrade or $300.
True. But Infinity Blade 2 looks pretty amazing, and if more developers can take advantage of the 543MP2, that would be great. Then again, you can always wait a few more months and something better will always come out, so I don't think it's a good idea to wait for the GS3 -- and it'll take much more than a few months for it to reach US carriers. I agree that $300 is a bit of a hard pill to swallow, especially when you can get a GSII with better hardware for cheaper.
