Which One Is Best??? - Atrix 4G Q&A, Help & Troubleshooting

Which one is better:
the Tegra 2 AP20H chipset (ARM Cortex-A9 CPU, ULP GeForce GPU)
OR
the TI OMAP 4430 with PowerVR SGX540?

NVIDIA AP (Tegra 2 AP20H):
GL_EXT_bgra,
GL_EXT_texture_compression_dxt1,
GL_EXT_texture_compression_s3tc,
GL_EXT_texture_format_BGRA8888,
GL_OES_byte_coordinates,
GL_OES_compressed_ETC1_RGB8_texture,
GL_OES_compressed_paletted_texture,
GL_OES_draw_texture,
GL_OES_EGL_image,
GL_OES_extended_matrix_palette,
GL_OES_fbo_render_mipmap,
GL_OES_fixed_point,
GL_OES_framebuffer_object,
GL_OES_matrix_get,
GL_OES_matrix_palette,
GL_OES_point_size_array,
GL_OES_point_sprite,
GL_OES_query_matrix,
GL_OES_read_format,
GL_OES_rgb8_rgba8,
GL_OES_single_precision,
GL_OES_stencil8,
GL_OES_texture_cube_map,
GL_OES_vertex_half_float
PowerVR SGX 540:
GL_EXT_multi_draw_arrays,
GL_EXT_texture_format_BGRA8888,
GL_IMG_read_format,
GL_IMG_texture_compression_pvrtc,
GL_IMG_texture_format_BGRA8888,
GL_IMG_texture_stream,
GL_IMG_vertex_array_object,
GL_OES_blend_equation_separate,
GL_OES_blend_func_separate,
GL_OES_blend_subtract,
GL_OES_byte_coordinates,
GL_OES_compressed_ETC1_RGB8_texture,
GL_OES_compressed_paletted_texture,
GL_OES_depth24,
GL_OES_draw_texture,
GL_OES_EGL_image,
GL_OES_egl_sync,
GL_OES_extended_matrix_palette,
GL_OES_fixed_point,
GL_OES_framebuffer_object,
GL_OES_mapbuffer,
GL_OES_matrix_get,
GL_OES_matrix_palette,
GL_OES_point_size_array,
GL_OES_point_sprite,
GL_OES_query_matrix,
GL_OES_read_format,
GL_OES_required_internalformat,
GL_OES_rgb8_rgba8,
GL_OES_single_precision,
GL_OES_stencil8,
GL_OES_stencil_wrap,
GL_OES_texture_cube_map,
GL_OES_texture_env_crossbar,
GL_OES_texture_mirrored_repeat
Going by all this, I think the PowerVR SGX 540 is better than the NVIDIA AP. What do you think, guys?
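For what it's worth, the two lists can be compared mechanically rather than eyeballed. Here's a quick Python sketch using the extension strings posted above (a set difference shows what each GPU exposes that the other doesn't; the counts still say nothing about speed):

```python
# Extension lists copied from the two posts above (the truncated last
# Tegra entry is completed to GL_OES_vertex_half_float).
tegra_ap20h = {
    "GL_EXT_bgra", "GL_EXT_texture_compression_dxt1",
    "GL_EXT_texture_compression_s3tc", "GL_EXT_texture_format_BGRA8888",
    "GL_OES_byte_coordinates", "GL_OES_compressed_ETC1_RGB8_texture",
    "GL_OES_compressed_paletted_texture", "GL_OES_draw_texture",
    "GL_OES_EGL_image", "GL_OES_extended_matrix_palette",
    "GL_OES_fbo_render_mipmap", "GL_OES_fixed_point",
    "GL_OES_framebuffer_object", "GL_OES_matrix_get",
    "GL_OES_matrix_palette", "GL_OES_point_size_array",
    "GL_OES_point_sprite", "GL_OES_query_matrix", "GL_OES_read_format",
    "GL_OES_rgb8_rgba8", "GL_OES_single_precision", "GL_OES_stencil8",
    "GL_OES_texture_cube_map", "GL_OES_vertex_half_float",
}
sgx540 = {
    "GL_EXT_multi_draw_arrays", "GL_EXT_texture_format_BGRA8888",
    "GL_IMG_read_format", "GL_IMG_texture_compression_pvrtc",
    "GL_IMG_texture_format_BGRA8888", "GL_IMG_texture_stream",
    "GL_IMG_vertex_array_object", "GL_OES_blend_equation_separate",
    "GL_OES_blend_func_separate", "GL_OES_blend_subtract",
    "GL_OES_byte_coordinates", "GL_OES_compressed_ETC1_RGB8_texture",
    "GL_OES_compressed_paletted_texture", "GL_OES_depth24",
    "GL_OES_draw_texture", "GL_OES_EGL_image", "GL_OES_egl_sync",
    "GL_OES_extended_matrix_palette", "GL_OES_fixed_point",
    "GL_OES_framebuffer_object", "GL_OES_mapbuffer", "GL_OES_matrix_get",
    "GL_OES_matrix_palette", "GL_OES_point_size_array",
    "GL_OES_point_sprite", "GL_OES_query_matrix", "GL_OES_read_format",
    "GL_OES_required_internalformat", "GL_OES_rgb8_rgba8",
    "GL_OES_single_precision", "GL_OES_stencil8", "GL_OES_stencil_wrap",
    "GL_OES_texture_cube_map", "GL_OES_texture_env_crossbar",
    "GL_OES_texture_mirrored_repeat",
}

tegra_only = sorted(tegra_ap20h - sgx540)  # supported only by the Tegra
sgx_only = sorted(sgx540 - tegra_ap20h)    # supported only by the SGX 540
print(len(tegra_ap20h), len(sgx540), len(tegra_only), len(sgx_only))
```

Running it shows 24 vs. 35 extensions, with 5 unique to the Tegra (notably the S3TC/DXT1 texture formats) and 16 unique to the SGX 540 (notably PVRTC and the separate-blend extensions). A longer list is still just a feature list, not a benchmark.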

WILL SOMEONE TELL ME?

More extensions don't mean better performance. The only way to truly tell is by testing both side by side with the same hardware and software.

So tell me, which one is best?

NOMIOMI said:
So tell me, which one is best?
HAHAHA!! Have you read the responses to your question, guy? Or do you just always respond with "Tell me which is best!"?

Best for what? The two don't perform comparably across different workloads. For example, say the Tegra is better at OpenGL while the OMAP outperforms it on computational power (just an example; I don't have the numbers here to tell you for real).

Related

Evo 3d gpu

So is it true that the GPU on the EVO 3D sucks, or is outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
I guess I should rephrase one of my questions: I'm asking how it will run the emulators because I saw someone playing N64oid on an SGS and it seemed pretty laggy, and if I'm not mistaken that has the same/similar GPU to the NS4G?
tannerw_2010 said:
So is it true that the GPU on the EVO 3D sucks? Or is outdated? I've heard some people say its actually worse than the NS4G's GPU. I want to play some demanding games so the GPU to me is important. How will it run N64oid and the PSX emulator? I'm coming from the hero so there is no question there, but if what everyone says is true about it being worse than the NS4G's GPU then to me it's kind of a disappointment in that regard.
The emulators use the CPU, so the Evo 3D will be fine; the PSX emulator runs fine on my 18-month-old Desire.
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
[email protected] said:
From everything that I have read, the 3D's GPU is suppose to be one of the best out right now.......
Yeah, I've heard that too, so it makes me wonder what's really true. It might tell you something that I heard the GPU isn't very good from the NS boards... but I think I've heard it on these boards too, just not nearly as much.
Look up YouTube videos of the GPU in action. 'Nuff said.
Maybe this will calm your fears
http://www.youtube.com/watch?v=DhBuMW2f_NM
Here's a better one:
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=youtube_gdata_player
The GPU in the Evo 3D should be the best out right now, supposedly up to twice as fast/powerful as Tegra 2. It does appear that some optimizations are needed to take advantage of this GPU, though, hence some of the early low benchmarks.
The GPU is the fastest right now. No need to speculate; it will be until Tegra 3 comes out, and I think it will still match Tegra 3 in most benchmarks. The SGX540 is good, but the Adreno 220 is faster.
What about the CPU? Is it worse than the Galaxy S CPU, or better?
jamhawk said:
What about the CPU? Is it worse than the Galaxy S CPU, or better?
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
a5ehren said:
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
donatom3 said:
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
Now that's gonna make all the difference.
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt there's anything the Samsung processor can do that this one can't, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
nkd said:
well this cpu has a totally different desgin. if you look at the videos it is plenty fast, I highly doubt it would not be able to do that the samsung processor would be able to do, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the gpu it uses, I believe the gs2 still uses sgx540 and the adreno is certainly newer and better. SGX540 is still one hell of a chip, but adreno 220 is actually better.
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though. Either way, plenty of muscle for any mobile platform.
firmbiz94 said:
Actually the gs2 uses a Mali gpu..I still think the adreno outclasses it..they both have advantages over each other tho..but plenty of muscle for any mobile platform
This thread slightly confuses me. The OP mentions the NS4G in the first post, then we have someone coming in asking about comparisons to the Galaxy S (S or S2?), and everyone answers about the GS2. Quick stat breakdown to answer whatever question is actually being asked here:
Nexus S 4G: 1.0 GHz single-core Hummingbird CPU, SGX540 GPU
Galaxy S: 1.0 GHz single-core Hummingbird CPU, SGX540 GPU
Galaxy S2 (Euro): 1.2 GHz dual-core Orion CPU, Mali-400 GPU
Evo 3D: 1.2 GHz dual-core MSM8660 CPU, Adreno 220 GPU
(Info from GSMArena)
The Nexus S and Galaxy S are last generation's phones, so to answer the OP: no, the Evo 3D doesn't have the same GPU/CPU as the NS4G. Not even similar; it's a generation (maybe even two) up. The Evo 4G is slightly slower than the NS4G, and it's running a 1.0 GHz Snapdragon with an Adreno 200 (not even a 205, which is the next in line before the 220).
As for the GS2 vs. the Evo 3D, they're supposed to be on par with each other, with the GS2 maybe being a bit faster, since Qualcomm isn't the best with GPUs (personal opinion). However, AFAIK nobody has done any real testing on the Sensation vs. the GS2 (same CPU/GPU), so there's no real data backing up that claim. The GS2 does have better benchmark scores, though, so take that as you will.
Disclaimer: I found all the numbers on the internets. They may be wrong.
You can't really prove anything without concrete proof. There is still no scientific or dedicated performance comparison covering all the GPUs found in the dual cores.
I'd say all the posts in this thread are just personal opinions.
The only things we can compare now are benchmark results, which aren't even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI OMAP
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used the Cortex-A9. I wonder why they chose the Cortex-A8 instead of the A9; the A8 is such old hardware now.
Don't worry too much about the A8 vs. A9 thing... the differences are not huge (45 nm vs. 40 nm). Also, Qualcomm heavily optimized the Scorpion such that it can actually perform operations the A9 can't. It will provide plenty of power. I would go into more detail, but that seems to upset some people on other threads.
peacekeeper05 said:
You can't really prove anything without having any concrete proof. There are still no scientific or a dedicated performance comparison with all the gpus found on a dual core.
I say all the post on this thread are just personal opinions.
The only thing we can compare now are benchmark results w/c are not even that credible.
Benchmarks(anandtech, quadrant etc)
1. Exynos
2. TI omap
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the qualcomm dual core uses cortex a9. I wonder why they choose cortex a8 instead of a9. Cortex a8 is so old hardware now
Almost all of the GPU benchmarks I've seen go like this:
1. Qualcomm
2. TI OMAP
3. Exynos
4. Tegra 2
5. Hummingbird
Qualcomm uses the A8 because they don't use the reference designs from ARM. The Snapdragon outperforms the Cortex-A8 reference by 20-30%, making it pretty close to the A9 reference.

[Q] 4430 vs 4460

How big is the performance difference between these SoCs? And even though the 4460 is more powerful, will we see performance differences, given that the Galaxy Nexus drives a higher pixel count?
The Razr will have the 4460, the same CPU as the Galaxy Nexus. The information about a 4430 chipset is just a first guess; Motorola and their distributors confirmed it will have the 4460 version.
UPD: I was wrong. They changed the information on the MOTODEV website; now the specs say it's the 4430. http://developer.motorola.com/products/razr-xt910/
nailll said:
Razr will have 4460 - the same CPU as Galaxy Nexus. Information about 4430 chipset - is just a first guess. Motorola and their distributors confirmed it will have 4460 version
Can you cite that? Everything that I've been reading about the RAZR has suggested otherwise.
Also read this: http://androidforums.com/motorola-droid-razr/431262-4430-4460-update-2-does-not-have-4460-a.html
It explains the differences between the two chips.
They posted the 4430 on the MOTODEV portal a few days ago.
It's 4430.
http://developer.motorola.com/products/droid-razr-xt912/
Damn. Sorry. I was confused. http://www.droid-life.com/2011/10/1...with-full-specs-omap4460-processor-confirmed/
The Galaxy Nexus uses the TI OMAP 4460 with the CPU at 1.2 GHz and the GPU at 304 MHz.
It doesn't use the full speeds of the 1.5 GHz CPU and 384 MHz GPU.
So the frequencies are the same between the Galaxy Nexus and the Moto Razr.
The Razr has a lower resolution, 960x540 vs. 1280x720.
So the Razr should be faster.
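The resolution argument is easy to put numbers on. A quick arithmetic sketch using just the two panel resolutions quoted above:

```python
# Pixels per frame each GPU must fill at native resolution.
razr = 960 * 540     # qHD panel on the Droid Razr
gnex = 1280 * 720    # 720p panel on the Galaxy Nexus
print(razr, gnex, round(gnex / razr, 2))
```

So at the same GPU clock the Galaxy Nexus has to push about 1.78x as many pixels (921,600 vs. 518,400), which is the whole basis of the "Razr should be faster" claim; it says nothing about other bottlenecks.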
Should the 4460 be more efficient than the 4430 at 1.2 GHz?
Diagrafeas said:
Galaxy Nexus uses TI OMAP 4460 at 1,2GHz CPU and 304MHz GPU.
It doesn't use full speed of 1.5GHz CPU and 384MHz GPU.
So frequencies are the same between Galaxy Nexus and Moto Razr.
Razr has lower resolution 960x540 vs 1280x720.
So Razr should be faster.
Where are you seeing that the GPU clock in the Galaxy Nexus is 304MHz?
Chirality said:
Where are you seeing that the GPU clock in the Galaxy Nexus is 304MHz?
Here are the specs: http://en.wikipedia.org/wiki/Texas_Instruments_OMAP#OMAP_4
The Galaxy Nexus has 384 MHz.
soremir said:
Here are the specs: http://en.wikipedia.org/wiki/Texas_Instruments_OMAP#OMAP_4
Galaxy nexus has 384 mhz
CPU and GPU frequencies are linked, so I strongly doubt that with a 1.2 GHz CPU you can get a 384 MHz GPU.
Diagrafeas said:
CPU and GPU frequencies are linked, so I strongly doubt that with a 1.2 GHz CPU you can get a 384 MHz GPU.
What makes you think that the CPU and GPU clocks are linked?
Chirality said:
What makes you think that the CPU and GPU clocks are linked?
They aren't... I don't know where he got that.
didibabawu said:
should the 4460 be more efficient than the 4430 at 1.2GHz?
With emphasis on "should", yes. All things being constant in the wild world of chip yields, the 4430 "should" require more effort to reach 1.2 GHz. Not long to find out.
rushless said:
With emphasis on "should", yes. All things constant in the wild world of chip yields, the 4430 "should" require more effort to reach 1.2ghz. Not long to find out.
I believe the 4430 actually comes in two flavors, 1 GHz and 1.2 GHz. I don't believe they are taking the 1 GHz processor and overclocking it.
My understanding is that the 4430 and 4460 come from the same wafers. The 4430s are just the ones that did not run reliably at 1.5 GHz but would at 1.2 GHz. It's kind of the same with PC processors: AMD had some quad cores that would only run reliably on three cores. Instead of throwing them away, they changed the model number and sold them. This has been common for years. So it's possible someone's 4430 might run reliably at 1.4 GHz.
Oaklands said:
My understanding is the 4430 and 4460 are from the same wafers. The 4430's are just the ones that did not run reliably at 1.5 but would at 1.2. Kind of the samw on pc processors. AMD had some quad cores that would only run reliably on 3 cores. Instead of throwing them away, change the model number and sell them. This has been common for years. So it could be possible someone's 4430 might run reliably at 1.4.
^ That is all that needs to be said.
Oaklands said:
My understanding is the 4430 and 4460 are from the same wafers. The 4430's are just the ones that did not run reliably at 1.5 but would at 1.2. Kind of the samw on pc processors. AMD had some quad cores that would only run reliably on 3 cores. Instead of throwing them away, change the model number and sell them. This has been common for years. So it could be possible someone's 4430 might run reliably at 1.4.
Usually when chips are binned this way, the higher-binned and lower-binned chips tend to be released at about the same time. However, there's a gap of several months between the release of OMAP4430-based devices and OMAP4460-based devices, which seems to indicate that they are manufactured separately. Granted, this is the SoC market, with long lead times and complicated device development cycles, so perhaps the chips were available at the same time and it has just taken longer for OMAP4460 devices to reach market, but the big gap between release windows suggests to me that these two SoCs are produced separately.
Chirality said:
Usually when chips are binned this way, the higher binned and lowered binned chips tend to be released at about the same time. However, there's a several months gap between the release of OMAP4430-based devices and OMAP4460-based devices, which seems to indicate that they are manufactured separately. Granted, this is the SoC-market with long lead times and complicated device development cycles so perhaps the chips were available at the same time but it has just taken longer for OMAP4460 devices to reach market, but the big gap between release frames suggest to me that these two SoCs are developed separately.
There's nothing preventing them from releasing them months apart.
zetsumeikuro said:
There's nothing preventing them from releasing them months apart.
Yes there is: inventory. If they are binning chips off the same line but only selling the lower-binned ones while holding off on selling the higher-binned ones for several months, then they are piling up inventory of the higher-clocked chips and not doing anything with them.
Now it is possible that the yield on the higher-clocked chips is very low, such that only after several months of binning did they have enough inventory to move them to OEMs. But then this would mean that you'll probably have a harder time overclocking the 4430s, given how much difficulty they had with yields for higher-clocked chips.
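The binning model described in the last few posts can be sketched in a few lines. This is only an illustration: the cutoffs mirror the chips' rated clocks, and the per-die test results are invented numbers, not real yield data:

```python
# Toy speed-binning sketch: each die off the wafer is tested and labeled
# with the highest speed grade it passes.
def bin_die(max_stable_mhz):
    if max_stable_mhz >= 1500:
        return "OMAP4460"   # passes the 1.5 GHz grade
    if max_stable_mhz >= 1200:
        return "OMAP4430"   # only passes 1.2 GHz (may still OC a bit higher)
    return "reject"

# Hypothetical max-stable clocks measured for five dies from one wafer.
wafer = [1560, 1480, 1240, 1150, 1510]
print([bin_die(mhz) for mhz in wafer])
```

The second die illustrates the overclocking point above: it's sold as a 4430 even though it tested stable well past 1.2 GHz.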

Really quad core?

Asus told us that the original Transformer was dual core, but it's actually 1 GHz for computing and 1 GHz for graphics (which I guess is dual core, but not in the way we all thought).
Just hoping they don't pull something similar with the Prime...
jleewong said:
Asus told us that the original Transformer was dual core, but it's actually 1 GHz for computing and 1 GHz for graphics (which I guess is dual core, but not in the way we all thought).
Just hoping they don't pull something similar with the Prime...
Huh? Not sure where you're pulling your information from, but the Tegra 2 consists of a dual-core CPU with each core at 1 GHz and a ULP GeForce GPU running at 333 MHz.
Technically speaking, it's 5 cores and a GPU.
The Tegra 3, which the Prime has, consists of four 1.4 GHz CPU cores along with another low-power core. It also has a 12-core GPU.
The TF101, the original Transformer, had a Tegra 2. The Tegra 2 had two 1 GHz CPU cores and a 333 MHz GPU.
http://en.wikipedia.org/wiki/Nvidia_Tegra
I guess it really was dual core. A while back, when I first purchased a Viewsonic G-Tablet (I've had every other Tegra 2 tablet since), I read a review that said the Tegra 2 wasn't really dual core because it was using 1 GHz for the CPU and another 1 GHz for the GPU. But according to the Tegra 2 wiki it really is dual core.
I still don't understand why my Archos 70 runs web pages and Go Launcher better; it only has a single-core 1 GHz CPU. Maybe it was Honeycomb that made the Tegra 2 seem sluggish.
Tegra 3 has the 4 cores plus that single companion core, so it should have no problems, I hope, especially when ICS hits.
jleewong said:
I guess it really was dual core, awhile back when I first purchased a Viewsonic G-tablet (I've had every other tegra 2 tablet since) I read a review that said tegra 2 wasnt really dual core because it was using 1ghz for cpu and another 1ghz for gpu. But according to the tegra 2 wiki it really is dual core.
Still dont understand why my Archos 70 runs webpages and go launcher better, it only has a single core 1ghz cpu . Maybe it was honeycomb that made the tegra 2 seem slugish.
Tegra 3 has the 4 cores and that single companion core, so it should have no problems I hope, expecially when ICS hits.
The Tegra 2 is two ARM A9 cores @ 1.0 GHz (and later on, 1.2 GHz) alongside an 8-core GeForce GPU.
The Tegra 3 is four ARM A9 cores @ 1.4 GHz alongside a 12-core GeForce GPU, plus a 5th underclocked A9 for power-saving features.
xTRICKYxx said:
The Tegra 2 is two ARM A9 cores @ 1.0 GHz (and later on, 1.2 GHz) alongside an 8-core GeForce GPU.
The Tegra 3 is four ARM A9 cores @ 1.4 GHz alongside a 12-core GeForce GPU, plus a 5th underclocked A9 for power-saving features.
The Tegra 3 is four ARM A9 cores @ 1.3 GHz (up to 1.4 GHz in single-core mode) alongside a 12-core GeForce GPU, plus a 5th underclocked A9 for power-saving features.
jleewong said:
Maybe it was Honeycomb that made the Tegra 2 seem sluggish.
Probably. They made the mistake of not making the home screen hardware-accelerated. It's supposed to be fixed in ICS.

ULP GeForce VS Adreno 205

Hi guys,
Please tell me which GPU will be best for both gaming and smooth HD video playback. I know it also depends on the other parameters and hardware, but here I'm talking only about the GPU. So which will be better and more powerful: the Nvidia ULP GeForce or the Adreno 205?
Somebody answer, please...
Hmm, in my personal opinion I would go for the Nvidia GPU. I don't know all the specs of both processors, but I have benchmarked two 1 GHz dual-core phones, one with the Nvidia GeForce and one with the Qualcomm Adreno 205, and the GeForce is just more outstanding in game benchmarks and videos. But the Mali inside the Samsung Galaxy S2's Exynos beats them both.
Source: my uncle's phone carrier center, heheh.
Here are my 2 cents... hope it helps you.

[Q] Which processor is better?

I have some questions on my mind before I buy a new Android smartphone!
Please guide me about processors,
like MTK / Cortex / Qualcomm / Scorpion / Snapdragon.
Maybe those are brands, but which one is better if they all run at 1 GHz?
And if they all run at 1 GHz, does that mean they are all dual core, or single core?
Specifically, which one is the better processor:
the MTK6573 with PowerVR SGX531 GPU,
or
the Qualcomm Scorpion with Adreno 200 GPU?
Both are 1 GHz!
Dual core > single core
If both are dual core, then which one is good? And if both are single core, then which one is good?
The Adreno 200 is a bit old these days. All the processors have their pros and cons; it's up to you which one you buy. I prefer Qualcomm, but it's not always the better choice.
So that means the PowerVR SGX531 GPU is better than the Adreno 200, right?
