Vorke Z3 RK3399 Powered 4K Smart TV Box with Type C SATA Interface

Rumours about Rockchip's RK3399 chipset have swept the internet, and we seem to be barely a soundbite away from it. Promising to upend the way we think about the TV box, Vorke has included the RK3399-powered Z3 in its ambitious plans. Next February is expected to mark the launch of the gadget.
Adopting the Rockchip RK3399, the Vorke Z3 is set to outperform almost all of its competitors on the market. Unveiled at CES 2016, the RK3399 is a hexa-core processor combining a dual-core ARM Cortex-A72 MPCore and a quad-core ARM Cortex-A53 MPCore, paired with a quad-core Mali-T860MP4 GPU. It reportedly offers a significant performance boost over its predecessors, including the RK3288, and outranks SoCs from Amlogic and Allwinner. It scored 72483 in AnTuTu. What madness!
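If you're curious how that 2+4 big.LITTLE layout actually shows up on a device, here's a minimal Python sketch (ours, not Vorke's; it assumes a Linux/Android shell with the standard cpufreq sysfs nodes) that groups cores into clusters by their maximum frequency:

Code:
from collections import defaultdict
from pathlib import Path

# Group CPUs by advertised max frequency using the standard Linux cpufreq
# sysfs nodes; on a big.LITTLE part like the RK3399 this typically splits
# into two clusters (2x Cortex-A72 + 4x Cortex-A53).
def cpu_clusters():
    clusters = defaultdict(list)
    for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
        f = cpu / "cpufreq" / "cpuinfo_max_freq"
        if f.exists():
            clusters[int(f.read_text())].append(cpu.name)
    return clusters

for max_khz, cpus in sorted(cpu_clusters().items(), reverse=True):
    print(f"{max_khz // 1000} MHz cluster: {', '.join(cpus)}")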
Unlike other contenders, the Z3 offers a staggering 4GB of LPDDR3 RAM and 32GB of eMMC storage to back your TV entertainment, so you can use it right out of the box. Alongside the storage, it's worth pointing out that Vorke brings together the compelling Kodi entertainment center and the RK3399's insane 4K image quality in one nifty box. The SoC is what gives it support for H.265 and VP9 4K video decoding.
One of the problems with any Android box is that it needs an optimal internet connection to stream the bandwidth-hogging video it is built for. On account of this, the Z3 equips itself with 802.11ac WiFi, the newest WiFi protocol, with data transfer speeds of up to 1200Mbps under some circumstances. Nor is the support for Gigabit Ethernet an afterthought.
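As a rough sanity check (the bitrates below are our own ballpark assumptions, not Vorke's figures), a few lines of Python show why realistic throughput matters more than the AC1200 headline number, which is just the sum of the 5GHz 802.11ac and 2.4GHz 802.11n link rates:

Code:
# Ballpark figures (assumptions, not measurements): typical 4K bitrates
# vs. realistic sustained throughput of each link.
streams_mbps = {"4K VP9 stream": 20, "4K HEVC remux": 80}
links_mbps = {"802.11ac, realistic": 200, "Gigabit Ethernet": 940}

for stream, need in streams_mbps.items():
    for link, have in links_mbps.items():
        verdict = "fine" if have >= need * 1.5 else "marginal"  # 50% headroom
        print(f"{stream} over {link}: {verdict}")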
When it comes to expansion possibilities, there is no prospect of compromise. With 1x USB 2.0, 1x USB 3.0, 1x Type-C and 1x SATA placed at the sides, the Z3 lets you attach USB peripherals or even more storage. The aforementioned Type-C port is another highlight in an Android 6.0 box, and when you factor in the sheer number of connections on the Z3, you begin to see why it is a little bigger than the Z1. With support for SATA, USB 2.0, USB 3.0, Gigabit Ethernet, SPDIF and HDMI 2.0, there are few devices you won't be able to connect to it.
What to Expect?
Rockchip RK3399
Android 6.0
4GB + 32GB
AC WIFI + Gigabit Ethernet
4K VP9
USB3.0, Type C and SATA interface

Just got confirmation that my Z1 is supposed to arrive in 3-5 days .......... So any idea what the price will be on this puppy, the Z3? When will it launch? The AnTuTu score is badass.

How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
The Z1 in some webshops is US$120 - I'd expect the Z3 to be about $160-180.
It looks like a decent unit - but also look around here; AnTuTu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) as doing 71000 in AnTuTu; others say it should be about 50k.

Fluffbutt said:
How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
The Z1 in some webshops is US$120 - I'd expect the Z3 to be about $160-180.
It looks like a decent unit - but also look around here; AnTuTu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) as doing 71000 in AnTuTu; others say it should be about 50k.
Click to expand...
Click to collapse
The Z1 is on sale at $74.99.
The Z3 is expected to be about twice the price of the Z1.
We will try CPU-Z.
Some devices have done optimizations for the AnTuTu benchmark, so they score pretty high.

linhuizhen said:
The Z1 is on sale at $74.99.
The Z3 is expected to be about twice the price of the Z1.
We will try CPU-Z.
Some devices have done optimizations for the AnTuTu benchmark, so they score pretty high.
Click to expand...
Click to collapse
I had an email from Vorke; they say "under $200".. so I replied, "Hopefully under $150 as well" hahah! Hope it's not too dear!
What about that 3DMark bench, Firestorm (or is it Fire Strike)... that seems to be a decent test, but it's mainly GPU, isn't it?

Fluffbutt said:
I had an email from Vorke; they say "under $200".. so I replied, "Hopefully under $150 as well" hahah! Hope it's not too dear!
What about that 3DMark bench, Firestorm (or is it Fire Strike)... that seems to be a decent test, but it's mainly GPU, isn't it?
Click to expand...
Click to collapse
We will check :fingers-crossed:

As a slight sideways jump - I notice its competitor boxes both have a heat sink + fan listed in their specs. Does anyone know if the Vorke is using active cooling?
I think I was right in a different forum's post - maybe the RK chip runs a little hotter than passive cooling can deal with?

So I got my Z1 here in Northern Virginia on Monday the 16th, ordered Jan. 5 from Geekbuying. After three days I'm pretty happy. Games I hadn't been able to play - because of a bricked MINIX X8-H, and not being able to root my MiBox so a PS3 SIXAXIS controller could work - run flawlessly on the Z1. Showbox runs smooth, likewise YouTube and Kodi 16.1; AnTuTu 3D score with no tweaks: 41723. Jide has a crowdfunded Rockchip RK3399 TV box for March or May, USD $99-129.

I don't trust a company that needs Kickstarter/crowdfunding to develop a device - they smell like "fly-by-nighters" to me... produce the device, rake in some dosh, run like buggery when people start to complain or want tech support.
That Mio/MixIO, whatever it's called... US$129 on Kickstarter... nope, zero trust...
Two things I like about Vorke - they exist as a company, with self-funded development... and they responded to my silly queries; pre-sales support suggests good after-sales support.

Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half-decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, it's early days; it might just be a place-holder price.
Edit - why did that double post? Weird.

Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half-decent laptop and use that as a TV box, FFS!
Click to expand...
Click to collapse
I agree. What's more, I really don't know what all the fuss about the RK3399 is, when for more than a year there has been an Amazon Fire TV available with the MediaTek MT8173 SoC (2x Cortex-A72 + 2x Cortex-A53).
Maybe this is not the best forum to discuss that, because MediaTek isn't fond of open source, but people prefer a working solution to one which you must fiddle with all the time to make it work for a while.

Yes - but the AmFireTV is sort of locked, isn't it - own UI, low customisation, Amazon bias (understandable).
I've heard that the Vorke will be completely unlocked, rooted, open... maybe...
Anyway, the specs say different:
Qualcomm Snapdragon 8064 Quad Core 4x @ 1.7Ghz Qualcomm Adreno 320
MediaTek 8173C Quad Core 2x @ 2GHz & 2x @ 1.6Ghz GPU PowerVR Rogue GX6250
Click to expand...
Click to collapse
Neither of those will match the RK3399, and the Mali-T860MP4 is a very good SoC GPU. Not "superb" or "the best", but certainly good enough for nearly everything.
I do NOT like AnTuTu as a benchmark (it's heavily biased towards certain chips) but the AFTV gets 53K while the RK3399 gets 73K.

Fluffbutt said:
I do NOT like AnTuTu as a benchmark (it's heavily biased towards certain chips) but the AFTV gets 53K while the RK3399 gets 73K.
Click to expand...
Click to collapse
I would be more impressed if my RK3288 device didn't already do 62k in AnTuTu. AFAIK AnTuTu is quite strongly GPU-biased, which might indicate that the RK3399's GPU is slower than the RK3288's, or that there are driver problems.
Besides, I prefer "pure" CPU benchmarks, which can give me an indication of internet browsing performance. When I did some research on the MT8173 more than a year ago I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914. (Note: Nexus 9 is the spoofed identity of my RK3288 device.) At the time I was quite pleased with that performance improvement, but devices with the MT8173 came only with 2GB RAM, which is too small an amount for me. Even so, the RK3399 isn't really more impressive to me than the MT8173 (check: browser.geekbench.com/v4/cpu/compare/939975?baseline=993600), and we are more than a year on.

Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half-decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, it's early days; it might just be a place-holder price.
Edit - why did that double post? Weird.
Click to expand...
Click to collapse
Ugoos is a trusted brand.

Jagee said:
I would be more impressed if my RK3288 device didn't already do a 70k AnTuTu score. AFAIK AnTuTu is quite strongly GPU-biased, which might indicate that the RK3399's GPU is slower than the RK3288's, or that there are driver problems.
Besides, I prefer "pure" CPU benchmarks, which can give me an indication of internet browsing performance. When I did some research on the MT8173 more than a year ago I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914. (Note: Nexus 9 is the spoofed identity of my RK3288 device.) At the time I was quite pleased with that performance improvement, but devices with the MT8173 came only with 2GB RAM, which is too small an amount for me. Even so, the RK3399 isn't really more impressive to me than the MT8173 (check: browser.geekbench.com/v4/cpu/compare/939975?baseline=993600), and we are more than a year on.
Click to expand...
Click to collapse
We will try other benchmarks too. Recommend one, please.

linhuizhen said:
We will try other benchmarks too. Recommend one, please.
Click to expand...
Click to collapse
I slightly overestimated the AnTuTu score for my RK3288 device; it should be 62k, not 70k. Nevertheless, I would really like to see Vellamo and Octane benchmark scores for the RK3399.
My RK3288 device can do about 4800 points with Chrome in Vellamo.
The Geekbench v4 score for the RK3399 is rather well known. You can brag only if you beat 1600 single-core and 3000 multi-core. browser.geekbench.com/v4/cpu/993600
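To put that 1600/3000 pair in perspective, here's a toy calculation (our assumptions, not Geekbench's methodology) of how far the quoted multi-core score falls from ideal big.LITTLE scaling:

Code:
# Toy arithmetic with the thread's numbers, not Geekbench methodology.
single = 1520                # quoted RK3399 single-core (one A72)
multi = 2840                 # quoted RK3399 multi-core
a53_weight = 0.45            # assumption: one A53 ~ 45% of one A72
ideal = single * (2 + 4 * a53_weight)
print(f"ideal multi-core:  {ideal:.0f}")                            # ~5776
print(f"quoted multi-core: {multi} ({multi / single:.2f}x single)")
# Shared memory bandwidth, thermals and the scheduler eat the rest.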

linhuizhen said:
Ugoos is a trusted brand.
Click to expand...
Click to collapse
I'm not disputing that - they have a good track record... but that doesn't stop $349 being too high for this device. And I give them mucho credit for NOT trying to effing Kickstarter the device!
But the AMOLED screen is really a non-item - what's the point? A TV box is stuck under the TV table; I don't even look at mine, just use it...
And what can a little screen show anyway, apart from the time or some form of channel display? Any more info would mean you'd have to get down on the floor, closer to it, to read it!
An AMOLED screen is perhaps $50 of that $150 over-pricing ($200 is all I'd be paying for this spec of TV box). For $350 you can get a full i7 TV box with an Intel HD 520 GPU (400 GFLOPS)! The Mali GPU in the RK3399 is rated at about 90 GFLOPS.
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
So I'm less knocking Ugoos themselves and more knocking their "vision" of the yet-to-come TV box.
*********************************************************************************
Geekbench 1520 and 2840 isn't too bad - just 160 short of that magical 3000 isn't to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC). (Some reviews say 4300, 4800, 5300, rounded off.)

Fluffbutt said:
An AMOLED screen is perhaps $50 of that $150 over-pricing
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
Click to expand...
Click to collapse
It might be a gimmick to make it stand out, but your estimation of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (5", 16:9 proportion).
I've seen OLED displays on MP3 players priced lower than $30.
Fluffbutt said:
Geekbench 1520 and 2840 isn't too bad - just 160 short of that magical 3000 isn't to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC). (Some reviews say 4300, 4800, 5300, rounded off.)
Click to expand...
Click to collapse
3000 is just a number for the multi-core Geekbench v4 score, which depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned in the new Samsung Galaxy Tab S2 9.7 can get over 5000 points in Geekbench v3 with 8 cores (4xA57 + 4xA53) working simultaneously (source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA72.
Another problem when comparing benchmarks is that some devices contain different hardware than previous "batches", like the new (Qualcomm MSM8976) and "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example (http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462) where one test is for the older design and with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which might vary per device.

Jagee said:
It might be a gimmick to make it stand out, but your estimation of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (5", 16:9 proportion).
I've seen OLED displays on MP3 players priced lower than $30.
3000 is just a number for the multi-core Geekbench v4 score, which depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned in the new Samsung Galaxy Tab S2 9.7 can get over 5000 points in Geekbench v3 with 8 cores (4xA57 + 4xA53) working simultaneously (source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA72.
Another problem when comparing benchmarks is that some devices contain different hardware than previous "batches", like the new (Qualcomm MSM8976) and "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example (http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462) where one test is for the older design and with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which might vary per device.
Click to expand...
Click to collapse
Oh, I agree on the guesstimate - I have no real idea what an OLED screen costs - your $30 is probably more valid than my $50 - but that makes it worse, not better.
What else is there to make this over $150 more than the other RK3399 boxes - "under $200" stated by Vorke, a $115 pre-sale price for another, and so on?
It MIGHT be a "place holder" number - but it's working against them... I'm not going back to look to see what the release price is; the $349 has put me off... especially if it's a "state a high price so the release over-price will sound better" move.
That's a marketing con, really - you state $349 before release, then drop it to a still-too-high $299 or $259 after release, and people will flock to it thinking they're getting a great deal. (Not saying they're doing it that way.)
But the other RK3399 boxes? £115 pre-release, expected to be £140 after; Vorke has stated to me, in email, "definitely under $200".
And I FULLY agree with you about the benchies.
Perhaps the better benchies are the real-world ones... game FPS -- 5.2 for one SoC, 14.8 for another, 25.7 for a third, and so on.
****************
Still - I'm liking this 6-core box... it's better than almost everything I've looked at, allowing for the Tegra devices that are semi-locked (heavily controlled launchers and biased "features" - understandable, of course, like the Amazon devices).

Related

What do you know about the Tegra 3 SoC in the Asus Prime?

-The Tegra 3 SoC (system on a chip) is a combo of a microprocessor, a memory controller, an audio processor, a video encoder and a graphics renderer. It's designed and manufactured by Nvidia, world leader in graphics computing, making its first appearance in the Asus Transformer Prime.
-The Tegra 3 SoC has 5 physical cores, but is limited to quad-core performance. The 5th, lower-power core is activated only when the device is idle or handling light tasks, such as syncing and e-mail checking. So power consumption is always kept to a minimum when the performance of the quad-core is not needed, ensuring longer battery life. Once you run a normal or higher-demanding task on the tablet, the 5th core shuts off automatically before the 4 main cores are activated (see the sketch after this list). This is all handled by the chip itself and doesn't require the user or the developer to change anything to use the Android OS and applications this way. Android OS already has the best support for multi-tasking and is multi-threading friendly compared to competing operating systems in the market. So this should be good news for the soon-to-be Asus Transformer Prime users.
-The GPU (graphics processing unit) in the Tegra 3 SoC has 12 shaders. But because Nvidia has not followed a unified-shader architecture in this ARM SoC, like they've been doing in their PC and Mac discrete graphics cards, 8 of those 12 shaders are reserved for pixel work and the remaining 4 are for vertex. Maybe Nvidia will use a unified-shader architecture in the next-generation Tegra SoCs, when ARM-based devices are ready for it. The PowerVR MP2 GPU in the iPad 2 has more raw power than the Tegra 3 GPU (actually, it's the only thing I personally like about the iPad 2, its GPU!), but the Tegra 3 GeForce (the commercial name Nvidia uses for their gaming graphics processors) should give solid 3D performance in games, especially the officially supported ones. Nvidia has a long history in 3D gaming and has been using its solid connections with game developers to bring higher-quality gaming to Android, as we've seen with the Tegra 2 SoC's capabilities in games listed in the TegraZone Android app. Add to that, games are not just GPU-bound: Tegra 3's quad cores and 1GB of system RAM (the iPad 2 has 512MB) will pump up gaming quality for sure, and the pixel density of 149ppi displays crisper images than the 132ppi of the iPad 2. Once the Asus Prime is released, it can officially be considered the highest-performing Android device in the world, especially for 3D gaming.
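Purely as an illustration of the companion-core handoff described above, here is a toy Python model (nothing like Nvidia's actual firmware, whose thresholds are not public; the 25% steps below are invented):

Code:
# Toy model of Tegra-3-style core switching: light load runs on the
# ~500 MHz companion core; heavier load hands off to the main quad.
# The two clusters are never active at the same time.
def active_cluster(load_pct):
    if load_pct < 25:                    # idle, sync, email checking
        return {"companion": "500 MHz"}
    n_main = 1 + min(3, load_pct // 25)  # bring up 1..4 main cores
    return {f"main{i}": "1300 MHz" for i in range(n_main)}

for load in (5, 30, 60, 95):
    print(f"{load:3d}% load -> {active_cluster(load)}")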
Well, I thought I'd have more to type; I paused for a long time and could not think of anything to add. I only wanted to share a few things I know about the Tegra 3. I have a high interest in computer graphics/processors and have been following the Tegra project since 2008.
Some of the Asus Prime to-be owners don't know or care that much about the technical details of the CPU in the device, and I thought of sharing with them.
Thanks and good luck.
Thanks for the info. Very interesting.
As I understand it, the use of the lower-power 5th core has decreased battery consumption by over 60% when compared to the earlier 2-core design. I am not sure how they are measuring consumption and the task load.
I am most excited about the tablet because of the Tegra 3.
In smartphones I find the idea of putting in more than one core quite rubbish.
It is not the best solution for a tablet or any other mobile device either. I would highly appreciate well-programmed software over overpowered hardware.
Yet the Tegra has a nice concept.
I think most of the time I won't use more than that 5th core. I mean, it is even powerful enough to play HD video.
I will primarily use apps that display text and images, like the browser, which is said to utilize 4 cores - but I am sure that's only because of crappy programming.
So if people finally come to their senses and start optimizing their apps, we will have one quite powerful core and 4 in backup for REAL needs. Seems like an investment in the future to me.
Sent from my Nexus One using XDA App
Straight from Wikipedia:
Tegra 3 (Kal-El) series
Processor: quad-core ARM Cortex-A9 MPCore, up to 1.4 GHz single-core mode and 1.3 GHz multi-core mode
12-Core Nvidia GPU with support for 3D stereo
Ultra low power GPU mode
40 nm process by TSMC
Video output up to 2560×1600
NEON vector instruction set
1080p MPEG-4 AVC/h.264 40 Mbps High-Profile, VC1-AP and DivX 5/6 video decode[18]
The Kal-El chip (CPU and GPU) is to be about 5 times faster than Tegra 2[19]
Estimated release date is now to be Q4 2011 for tablets and Q1 2012 for smartphones, after being set back from Nvidia's prior estimated release dates of Q2 2011,[20] then August 2011,[21] then October 2011[22]
The Tegra 3 is functionally a quad-core processor, but includes a fifth "companion" core. All cores are Cortex-A9s, but the companion core is manufactured with a special low power silicon process. This means it uses less power at low clock rates, but more at higher rates; hence it is limited to 500 MHz. There is also special logic to allow running state to be quickly transferred between the companion core and one of the normal cores. The goal is for a mobile phone or tablet to be able to power down all the normal cores and run on only the companion core, using comparatively little power, during standby mode or when otherwise using little CPU. According to Nvidia, this includes playing music or even video content.[23]
Tegra 3 officially released on November 9, 2011
Tegra 2's maximum RAM limit was 1GB. Tegra 3's could be 2GB.
xTRICKYxx said:
Straight from Wikipedia:
Tegra 2's maximum RAM limit was 1GB. Tegra 3's could be 2GB.
Click to expand...
Click to collapse
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3... so it's possible. However, the same leak/article also says its chip is clocked at 1.6GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
jerrykur said:
As I understand it, the use of the lower-power 5th core has decreased battery consumption by over 60% when compared to the earlier 2-core design. I am not sure how they are measuring consumption and the task load.
Click to expand...
Click to collapse
You can read the white papers on the Tegra 3 over on Nvidia's website. The chip has a controller built in that activates either the 4 main cores or the single companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes but the same design structure, in order to maximize energy efficiency across the performance curve. Each process is more efficient on a different part of the power curve, so the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff.
RussianMenace said:
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3... so it's possible. However, the same leak/article also says its chip is clocked at 1.6GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
Click to expand...
Click to collapse
*Correction: Tegra 3 supports DDR2 AND DDR3. The original Transformer had 1GB of DDR2 @ 667MHz. The Prime has 1GB of LPDDR2 @ 1066MHz, a considerable bump in speed. Also, Tegra 3 supports up to DDR3 @ 1500MHz!
xTRICKYxx said:
I think the only compatible RAM would be DDR2. Clock speeds don't matter, as the Tegra 3 can be OC'd to 2GHz no problem.
Click to expand...
Click to collapse
I'm sure it can; hopefully they increase the battery capacity to compensate for the increased power use. As for the memory, Nvidia's site on the Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2... so I don't know.
RussianMenace said:
I'm sure it can; hopefully they increase the battery capacity to compensate for the increased power use. As for the memory, Nvidia's site on the Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2... so I don't know.
Click to expand...
Click to collapse
The Prime's RAM speed is considerably faster than the TF101's.
If it does have room to expand, could we expand or upgrade the RAM?
doeboy1984 said:
If it does have room to expand, could we expand or upgrade the RAM?
Click to expand...
Click to collapse
Judging by the pictures, it doesn't look like the RAM will be removable or upgradeable (the RAM is the Elpida chip right next to the processor).
xTRICKYxx said:
The Prime's RAM speed is considerably faster than the TF101's.
Click to expand...
Click to collapse
I never said it wasn't.
What I said is that both Tegra 2 and now Tegra 3 have a single 32-bit-wide memory interface, compared to the two on the A5, Exynos, Qualcomm and OMAP4 chips. What that means is that it will theoretically have lower bandwidth, which may cause problems with upcoming games, especially considering that you now have to feed extra cores and a beefier GPU. Now, whether or not it will actually be an issue... we will have to see.
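For a rough sense of what that single 32-bit channel means, here's the peak-bandwidth arithmetic using the speeds quoted earlier in this thread (theoretical peaks only; real sustained bandwidth is lower):

Code:
# Peak bandwidth = bus width (bytes) x effective transfer rate.
def peak_gb_s(bus_bits, mt_s):
    return bus_bits / 8 * mt_s / 1000    # MT/s -> GB/s

print(f"TF101, DDR2-667 @ 32-bit:    {peak_gb_s(32, 667):.1f} GB/s")   # ~2.7
print(f"Prime, LPDDR2-1066 @ 32-bit: {peak_gb_s(32, 1066):.1f} GB/s")  # ~4.3
print(f"Dual-channel rival, 64-bit:  {peak_gb_s(64, 1066):.1f} GB/s")  # ~8.5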
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even for pure CPU benches, the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR.. a much smaller company with a lot less money.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even for pure CPU benches, the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR.. a much smaller company with a lot less money.
Click to expand...
Click to collapse
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware as the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put (probably way over-simplified), it lets them do more with less.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even for pure CPU benches, the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR.. a much smaller company with a lot less money.
Click to expand...
Click to collapse
Very good point. Also, Apple has the apps and games that showcase and utilize all this extra power. Even my original iPad has apps/games that I haven't seen Android dual-core equivalents of. I love my iPad, but I also own the Atrix, a dual-core Tegra 2 phone. I know open-source Android will win out in the end.
I came across a good comment in the Lenovo specs link that a member here posted in this thread:
"Google and NVidia need to seriously subsidize 3rd party app development to show ANY value and utility over iPad. Apple won't rest on its laurels as their GPU performance on the A5 is already ahead with games and APPs to prove it".
What do you all think about this? Not trying to thread-jack, as I see it's relevant to this thread too. What apps/games does Android have up its sleeve to take advantage of this new Tegra 3? The majority of Android apps/games don't even take advantage of Tegra 2 and similar SoCs yet. Are we going to have all this extra power for a while without it ever really being used to its potential? Android needs some hardcore apps and games. The iPad has all the b.s. stuff too, BUT also very hardcore apps and games that use it to close to its full potential. IMO my jailbroken iPad 1 still trumps most of these Tegra 2 tablets out now, not because of hardware specs, but because of the quality of apps and games I have. I've noticed Android is finally starting to get more hardcore games like ShadowGun, Gameloft games, etc. I can't overclock or customize my iPad as extensively as Android, but the software/apps/games I have are great. No, I don't want an iPad 2 or iPad 3; I want an Android tablet now because of its greater potential. Just like with anything in life, potential doesn't mean sh$& if it's not utilized and made a reality.
I was a Windows Mobile person first. Then I experienced dual-booting with XDAndroid on my Tilt 2, and I loved it. Then I knew I wanted a real Android phone or tablet. The first Android tablet I owned, for only a day, was the Archos 7 IT. It was cool, but I returned it since it couldn't connect to my WMwifirouter, which uses an ad-hoc network. So I researched and finally settled on taking a chance with the Apple iPad. I used to be an Apple hater to the max.. lol. My iPad changed all of that. I still hate the closed system of Apple, but I had to admit the iPad worked great for what I needed and wanted to do. This iPad I'm writing this post on now still works flawlessly after almost 2 years, and its specs are nowhere near the iPad 2's or all these new dual-core tablets out. I'm doing amazing stuff with only 256MB of RAM.. SMH. I hated having to hook the iPad up to iTunes for everything like music and videos, so I jailbroke it and got iFiles, which is basically a very detailed root file explorer. I also have the USB and SD card adapter, so now I can put my content on my iPad myself without needing to be chained to iTunes; iTunes is only good for software updates. I'm still on 4.2.1 jailbroken firmware on the iPad; I never bothered or really wanted to upgrade to the new iOS 5.0.1 out now. With all my jailbreak mods/tweaks, I've been doing most of the new stuff people are only now able to do. All Apple did was implement jailbreak tweaks into their OS, for the most part.
Sorry for the long rant. I'm just excited about getting the new Prime Tegra 3 tablet. I just hope the apps/games that really take advantage of this power start rolling out fast - and I don't just mean TegraZone stuff.. lol. Android developers are going to have to really step their game up once these new quad-cores come out, and really even now with dual cores. I'm a fan of technology in general; competition only makes things better. Android is starting to overtake Apple in sales and similar categories. The only thing is Android hasn't gotten on par with Apple-quality apps yet. The iPad's tablet-only apps are very numerous; lots are b.s., but tons are very great too. I'm just hoping Android tablet-only apps will be at least the same quality or better. I'm not looking to get a new quad-core tablet to play Angry Birds or other kiddie-type games. I'm into productivity, media apps, and hardcore games like Rage HD, NOVA 2, Modern Combat 3, Order & Chaos, Infinity Blade, ShadowGun, etc., all of which I have and more on my almost 2-year-old iPad 1.
Asus, being the first manufacturer to come out with a quad-core tablet and Super IPS+ display, might be just the last push needed to really get things rolling for Android, as far as high-quality software and a tablet-optimized OS go. Can't wait to see how this plays out.
---------- Post added at 01:00 PM ---------- Previous post was at 12:58 PM ----------
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware as the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put (probably way over-simplified), it lets them do more with less.
Click to expand...
Click to collapse
Great point - just what I was saying, basically, in my long post.. lol
nook-color said:
You can read the white papers on the Tegra 3 over on Nvidia's website. The chip has a controller built in that activates either the 4 main cores or the single companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes but the same design structure, in order to maximize energy efficiency across the performance curve. Each process is more efficient on a different part of the power curve, so the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff.
Click to expand...
Click to collapse
That is correct. Actually, the "5th" core is licensed with the ARM A7 instruction set; the quads are A9.
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware as the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put (probably way over-simplified), it lets them do more with less.
Click to expand...
Click to collapse
Again, I agree. It's like asking why the Xbox 360 and PS3 consoles can still push high-quality graphics compared to a new high-end PC: uniformity of hardware plays a big role there.
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 PlayStation 3 with performance and graphics very similar to my PC's.
CyberPunk7t9 said:
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 PlayStation 3 with performance and graphics very similar to my PC's.
Click to expand...
Click to collapse
That's because these days most PC games are console ports.
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-year) lifespan of the Tegra 3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support make it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games - unless of course you're talking about emulators.
e.mote said:
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-year) lifespan of the Tegra 3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support make it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games - unless of course you're talking about emulators.
Click to expand...
Click to collapse
Is that true? NTFS support? Are you sure? Can you link me to a spec for that? If so, then I can transfer files from my SD card to an external NTFS drive without using Windows! That would be great for trips when I need to dump digital pics.

Exynos 5250...

Hi geeks,
check out this wiki for general information on the new Exynos dual-core:
http://www.arndaleboard.org/wiki/index.php/Resources
Of course this file might be of special interest...
BTW:
This document is marked as confidential, but it's publicly available.
So Mike, if this is against rules... tell me!
Best regards,
scholbert
AnTuTu benchmark of the Nexus 10
http://www.antutu.com/view.shtml?id=2960
Quite impressive, because AnTuTu depends much more on the number of cores and clock rate than on architecture (a 1.5GHz Snapdragon S3 got ~6600 while the 1.4GHz Exynos 4210 in the GNote had only ~6300).
And the NAND flash is pretty good too: 16.6MB/s write and >50MB/s read (in fact my Note 2 gets 200 points in SD card read with the mark >50MB/s too, but this one is 10% faster).
I would love to see one of those fancy graphics comparing the Nexus 10's performance with "the others".
We know that it's better, but how much better?
And if you want more specs, here are two other benchmarks:
SunSpider:
GLBenchmark:
If anyone is interested in the AnTuTu scores across the Nexus 4, 7 and 10 devices, I've cut'n'pasted them together from the AnTuTu site linked above...
These results are as recorded by AnTuTu themselves at these links...
Nexus 10 - http://www.antutu.com/view.shtml?id=2960
Nexus 7 - http://www.antutu.com/view.shtml?id=1282
Nexus 4 - http://www.antutu.com/view.shtml?id=2940
The 3D and 2D scores on the Nexus 10 keep up with the other two devices, which seems quite impressive considering the higher resolution.
Keitaro said:
If anyone is interested in the AnTuTu scores across the Nexus 4, 7 and 10 devices, I've cut'n'pasted them together from the AnTuTu site linked above...
These results are as recorded by AnTuTu themselves at these links...
Nexus 10 - http://www.antutu.com/view.shtml?id=2960
Nexus 7 - http://www.antutu.com/view.shtml?id=1282
Nexus 4 - http://www.antutu.com/view.shtml?id=2940
The 3D and 2D scores on the Nexus 10 keep up with the other two devices, which seems quite impressive considering the higher resolution.
Click to expand...
Click to collapse
Odd - why is the CPU on the Nexus 10 slower than the others? I thought the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali T604 or whatever it is).
Also, Scumbag AnTuTu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape; apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Kookas said:
Odd - why is the CPU on the Nexus 10 slower than the others? I thought the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali T604 or whatever it is).
Also, Scumbag AnTuTu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape; apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Click to expand...
Click to collapse
You need to keep in mind that the N10 is running at a MUCH larger resolution; that most likely has something to do with it. Had the processor been in the same device as the 4 and 7, you would see a substantial difference.
tkoreaper said:
You need to keep in mind that the N10 is running at a MUCH larger resolution; that most likely has something to do with it. Had the processor been in the same device as the 4 and 7, you would see a substantial difference.
Click to expand...
Click to collapse
But I wouldn't expect the load of that higher res to go to the CPU, just the GPU (so the 2D and 3D scores). Does the CPU get involved in video processing in SoCs?
via Tapatalk
Does the Nexus label mean that all drivers for this device will be open source? As in, none of the BS that the devs for the i777 are experiencing, with Samsung completely unwilling to release specs for the Exynos/Mali combo in that device?
EDIT: Answered my own question. The AOSP site itself tells you to go get the blobs for specific devices if you want to build. So no. Ah well, my concern was fully functional OS updates, and the Nexus label DOES solve that - at least for a couple of years after release.
These I/O results look promising. A lot better than the Transformer Prime I had.
While it's nice to see numbers, you should always take them with a grain of salt (for obvious reasons). I for one am just going to wait till I have my Nexus 10 in my hands and see how she flies. I have no doubt that it will run today's apps with no issues and easily last you a year+. I for one am drooling over the display (esp. if non-PenTile). I just hope Samsung did something to address the possibility of pixel fatigue and ghosting/image retention.
Kookas said:
Odd - why is the CPU on the Nexus 10 slower than the others? I thought the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali T604 or whatever it is).
Also, Scumbag AnTuTu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape; apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Click to expand...
Click to collapse
Because it's the AnTuTu benchmark. As I said in #2, AnTuTu always prefers the number of cores and their frequency over the architecture.
That's why the 1.5GHz Crapdragon S3 got a higher score than the 1.4GHz Exynos 4210 in the GNote.
hung2900 said:
Because it's the AnTuTu benchmark. As I said in #2, AnTuTu always prefers the number of cores and their frequency over the architecture.
That's why the 1.5GHz Crapdragon S3 got a higher score than the 1.4GHz Exynos 4210 in the GNote.
Click to expand...
Click to collapse
What are you talking about? The Snapdragon is a better architecture with fewer cores. You have it backwards.
Edit: thought you meant the Snapdragon S4.
Sent from my Galaxy Nexus using xda premium
Here are the benchmarks from Anand's review of the Chromebook:
http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6
I am almost sure the N10 will be better optimized than the Chromebook, purely because of the resources dedicated to Android. It also shows that Samsung is still Google's preferred partner in terms of hardware.
zetsumeikuro said:
While it's nice to see numbers, you should always take them with a grain of salt (for obvious reasons). I for one am just going to wait till I have my Nexus 10 in my hands and see how she flies. I have no doubt that it will run today's apps with no issues and easily last you a year+. I for one am drooling over the display (esp. if non-PenTile). I just hope Samsung did something to address the possibility of pixel fatigue and ghosting/image retention.
Click to expand...
Click to collapse
It's not an AMOLED display, it's Super PLS (LCD), isn't it?
blackhand1001 said:
What are you talking about? The Snapdragon is a better architecture with fewer cores. You have it backwards.
Sent from my Galaxy Nexus using xda premium
Click to expand...
Click to collapse
Do some research, man! You can search for how badly the Crapdragon S3 MSM8660 compared to the Exynos 4210 in the same Galaxy Note (US variant vs international variant).
philos64 said:
And if you want more specs, here are two other benchmarks:
SunSpider:
GLBenchmark:
Click to expand...
Click to collapse
That SunSpider score is bad vs the Chromebook; it must be the resolution, which is high even for high-end PCs.
The epic screen was always going to eat most of the resources. The question is: whatever benchmark the final (and in future surely improved) SW version produces, is it enough for smooth operation? That answer will most likely be yes. The Chromebook shows huge HW potential, but it's also more optimized at this moment. Patience, my lads.
The hardware has the potential
Hi Guys,
Came across this while researching the Exynos 5250. It looks like the hardware is very capable of handling the WQXGA resolution, with memory bandwidth and power to spare. This white paper also mentions support for 1080p 60fps wireless display, so I hope Miracast will be a reality as well; Google just needs to step up and utilize the hardware to its full potential. It's an interesting read nonetheless.
Sorry, cannot post links yet.. replace _ with . and then try.
www_samsung_com/global/business/semiconductor/minisite/Exynos/data/ Enjoy_the_Ultimate_WQXGA_Solution_with_Exynos_5_Dual_WP.pdf
---------- Post added at 04:26 AM ---------- Previous post was at 04:20 AM ----------
oneguy7 said:
Hi Guys,
Came across this while researching the Exynos 5250. It looks like the hardware is very capable of handling the WQXGA resolution, with memory bandwidth and power to spare. This white paper also mentions support for 1080p 60fps wireless display, so I hope Miracast will be a reality as well; Google just needs to step up and utilize the hardware to its full potential. It's an interesting read nonetheless.
Sorry, cannot post links yet.. replace _ with . and then try.
www_samsung_com/global/business/semiconductor/minisite/Exynos/data/ Enjoy_the_Ultimate_WQXGA_Solution_with_Exynos_5_Dual_WP.pdf
Click to expand...
Click to collapse
If the link does not work, google "exynos 5 dual white paper".
I ran Quadrant and compared the results with those of my old Epic 4G.
The Epic 4G's graphics (3D) score is 1666. The N10's graphics (3D) score is 2087. See below.
Epic 4G: rooted, FC09 MTD deodexed, Shadow kernel, io/cpu = deadline/ondemand
Nexus 10: not rooted

iPad 4 vs 5250 (Nexus 10 SoC) GLBenchmark full results. UPDATE: now with Anandtech!

XXXUPDATEXXX
Anandtech have now published their performance preview of the Nexus 10; let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result for the iPad 4 has appeared on GLBenchmark, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so will be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, which is an important criterion for gaming.
Which device wins? Click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which is run independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is twice as fast as its older brother; the Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS; Jelly Bean, however, did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500MHz vs 250MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output etc. are not double the iPad 3's, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
I'd imagine the new iPad will keep the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports on the Nexus 10 are overwhelmingly positive.
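For reference, the same Egypt HD figures reduced to ratios (straight arithmetic on the numbers above):

Code:
fps = {"iPad 4": 48.6, "iPad 3": 25.9, "Exynos 5250": 33.7}
print(f"iPad 4 vs iPad 3:      {fps['iPad 4'] / fps['iPad 3']:.2f}x")       # ~1.88x
print(f"iPad 4 vs Exynos 5250: {fps['iPad 4'] / fps['Exynos 5250']:.2f}x")  # ~1.44x
# If mature drivers lift the 5250 to ~40 FPS as guessed above, the gap
# narrows to roughly 1.2x.
print(f"gap at a projected 40 FPS: {fps['iPad 4'] / 40:.2f}x")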
So the Mali T604 didn't manage 5 times better than the Mali 400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had a very bad GLBenchmark score initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
hung2900 said:
So the Mali T604 didn't manage 5 times better than the Mali 400, or maybe Samsung underclocked it.
Still very good, but not the best.
Click to expand...
Click to collapse
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Damn.. now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as those expected for the N10. Samsung has a habit of optimizing its drivers with further updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic - even worse than Tegra 2's.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less-than-Tegra-2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x, and Pro also got 20% higher. They could have been higher still if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in the numbers.
If you check the numbers for the SGS2 again now, there's another 50% improvement in performance since the time Anand did his review.
Check the SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how big an effect driver optimization can have on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
Also, check the fill rate in the Arndale board test: it's much less than expected. ARM says that a Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz, so it shouldn't be getting so little.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
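A quick check of that fill-rate arithmetic (the 4 pixels/clock figure for the Mali-T604 is the commonly quoted one, so treat this as an estimate):

Code:
# Theoretical fill rate = pixels per clock x clock speed.
pixels_per_clock = 4
clock_mhz = 533
theoretical_gpix = pixels_per_clock * clock_mhz / 1000
print(f"theoretical:    {theoretical_gpix:.2f} GPixels/s")          # ~2.13
print(f"Arndale (~60%): {theoretical_gpix * 0.60:.2f} GPixels/s")   # ~1.28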
hung2900 said:
So the Mali T604 didn't manage 5 times better than the Mali 400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had a very bad GLBenchmark score initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
Click to expand...
Click to collapse
In areas where the Mali 400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance. Meanwhile, in the low-level tests the iPad 4 is not a concrete 2x the power of the iPad 3, yet it achieves twice the FPS of its older brother in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as those expected for the N10. Samsung has a habit of optimizing its drivers with further updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic - even worse than Tegra 2's.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less-than-Tegra-2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x, and Pro also got 20% higher. They could have been higher still if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in the numbers.
If you check the numbers for the SGS2 again now, there's another 50% improvement in performance since the time Anand did his review.
Check the SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how big an effect driver optimization can have on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
Also, check the fill rate in the Arndale board test: it's much less than expected. ARM says that a Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz, so it shouldn't be getting so little.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
Click to expand...
Click to collapse
I agree with most of what you have said. On the GPixel figure this is like ATI GPU teraflops figures always being much higher than Nvidia, in theory with code written to hit the device perfectly you might see that those high figures, but in reality the Nvidia cards with lower on paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, vs maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The amount of resources needed to drive those high-MP displays means lots of compromises will be made in terms of effects, polygon complexity etc. to ensure decent FPS - especially when you consider that driving Battlefield 3 at 2560 x 1600 with AA and high textures requires a PC that burns 400+ watts of power, not a 10-watt SoC.
Overall, considering the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both; the biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
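For a sense of scale on those display claims, here's a quick pixel-count comparison (a sketch; the resolutions are the commonly quoted panel specs):
Code:
# raw pixels the GPU must shade each frame, relative to a 1080p monitor
panels = {
    "Nexus 10 (2560x1600)": 2560 * 1600,
    "iPad 3/4 (2048x1536)": 2048 * 1536,
    "1080p (1920x1080)": 1920 * 1080,
}
base = panels["1080p (1920x1080)"]
for name, px in panels.items():
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.2f}x 1080p)")
So the Nexus 10 pushes almost twice the pixels of a 1080p screen, which is why effects and polygon budgets get squeezed.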
What's the point of speculation? Just wait for the device to be released and run all the tests you want to get confirmation on performance. It doesn't hurt to wait.
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Click to expand...
Click to collapse
Both A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if new SoCs are too powerful and max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
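If you want to sanity-check the ISA point on a real device, here is a minimal sketch (it assumes an ARM Linux/Android shell, e.g. via adb, where /proc/cpuinfo is readable; the exact field names can vary by kernel):
Code:
# both Cortex-A9 and Cortex-A15 parts report the same ARMv7 architecture here
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith(("Processor", "CPU architecture", "Features")):
            print(line.rstrip())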
I can't wait to see this Exynos 5250 in a 2.0GHz quad-core variant in the semi-near future... Ohhhh, the possibilities. Samsung has one hell of a piece of silicon on their hands.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then make it fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then make it fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Click to expand...
Click to collapse
True... Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would have much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimizations aren't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware; in fact the N4 scores lower than the Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about fill rate. I need to check what you guys say about this.
Is it running some debugging code on the devices at this time?
Turbotab said:
Both A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if new SoCs are too powerful and max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
Click to expand...
Click to collapse
Actually, no. A8 and A9 are the same ISA (ARMv7), while A5, A7 and A15 are in another group (ARMv7-A).
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750MHz would destroy everything.
hung2900 said:
Actually, no. A8 and A9 are the same ISA (ARMv7), while A5, A7 and A15 are in another group (ARMv7-A).
Click to expand...
Click to collapse
I have to disagree; this is from ARM's info site:
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
Keion said:
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750MHz would destroy everything.
Click to expand...
Click to collapse
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that that awesome resolution taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that that awesome resolution taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Click to expand...
Click to collapse
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Click to expand...
Click to collapse
Nah, I think we can beat that too.
Drivers + OC.
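The thread's own numbers make that claim easy to sanity-check (a sketch; it takes the 57% iPad 4 figure and the 533MHz stock clock at face value and assumes GPU performance scales linearly with clock):
Code:
stock_mhz, oc_mhz = 533, 750
ipad4_gap = 1.57              # iPad 4 GPU quoted as 57% faster above
oc_gain = oc_mhz / stock_mhz  # gain from the overclock alone
print(f"OC gain: {oc_gain:.2f}x")                       # ~1.41x
print(f"left for drivers: {ipad4_gap / oc_gain:.2f}x")  # ~1.12x
So the overclock alone falls just short of the quoted gap; drivers would need to find the remaining ~12%.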

Rumour - Nexus 10 (Version 2)

Hi Guys,
I had a Nexus 10 for about a week, but unfortunately it was faulty. I returned it and thought I'd wait a month for new stock, then re-purchase!
For the week I had it... I loved it... and now I can't get any Nexus 10s from the Play Store - I'm just sat here waiting.
I came across this article regarding a revised Nexus 10 device!! Anyone heard anything else?
Here's the article:
http://www.brightsideofnews.com/new...let-at-mwc-2013-quad-core2c-gpgpu-inside.aspx
Thought I'd throw it out there!
If the article is true - I'm gonna be waiting for the revised version!
Peace Out
I'm not gonna hold my breath for any new Nexus device at MWC. I think the next thing we'll see is the successor to the Nexus 7 at Google I/O in May.
I don't know... I don't read many people complaining about the power of the Tegra 3 in the Nexus 7, while the article is correct that, pushing this resolution, the Nexus 10 is a bit underwhelming on the performance side. I'd see them wanting a top-performing 10" to one-up the iPad 4. They've beaten the iPad 4 in DPI (not color); if they can boost performance and hopefully adjust the color values, they'd do quite well.
I just see making these adjustments to a Nexus 10 (2) as more needed than any adjustments on the Nexus 7.
If true, would this solve the issue with missing Miracast?
Sent from my GT-I9100 using xda premium
I believe all the issues have to do with how little rigidity there is in the chassis. The GG2 is just too thin to stay flat with such a long, narrow panel. It's the uneven backlight that makes the light bleed and colors so bad. If the device were stiffer and/or had thicker glass, I'm sure the screen would be much better. Almost all the screen issues, I believe, stem from this.
Sent from my 5th and final Nexus 4
---------- Post added at 04:53 PM ---------- Previous post was at 04:50 PM ----------
To expand on that... the screen is not really flat. I think in their desire to make it light and thin they didn't anticipate the screen sag. I had my screen replaced by Samsung, and the replacement was either pressed in too hard or warped from the start, because it had worse bleed than the original. If I gripped the tablet from the edges and pushed back as if to snap the device in half, all bleed went away.
Sent from my 5th and final Nexus 4
I hope Google won't crap on early adopters like that and will just work on optimizing the software for the current hardware... I have no more money to upgrade!
c0minatcha said:
Here's the article
Click to expand...
Click to collapse
With the combination of 'brightsideofthenews.com' and 'according to the people we spoke with' as attribution for the source in the article, I'd say this is probably below the scale of a rumor.
As far as user experience goes for me so far, the tablet itself hasn't really slowed me down due to speed yet. Playing 1080p has been smooth... I like the 1600p display, as it matches the native resolution of my desktop monitor, so it works really well for remote desktop.
I understand how Google wants to take the tech leadership from Apple. If that can somehow lead developers to start developing better apps than on iOS, that would be great. I'm so tired of playing second fiddle to so many of these apps that have much better iOS versions.
As far as the hardware goes, I would gladly pay a little extra if they could include 64 or 128 GB of storage. Better yet, a microSDXC slot would be nice... Also, a 4G version would be good. And if they could ever change the software to allow ad-hoc networking... I think these are more important than an 8-core CPU/GPU.
Jayrod1980 said:
I believe all the issues have to do with how little rigidity there is in the chassis. ... If I gripped the tablet from the edges and pushed back as if to snap the device in half, all bleed went away.
Click to expand...
Click to collapse
It is not nearly as flexible as you make it sound. If this were truly the problem with the display, then flexing the device would cause backlight shifts and screen distortions. That, however, has not been the case in my experience.
wildpig1234 said:
As far as user experience goes for me so far, the tablet itself hasn't really slowed me down due to speed yet. Playing 1080p has been smooth.
Click to expand...
Click to collapse
Same here, after some tweaking.
It's reference quality in playback and smoothness, and the screen is out of this world.
I can say that after a month of use: I have never seen a display producing "more" detail and sharpness from 1080p Blu-ray material.
It's almost as if the N10 knows what color an upscaled pixel should have - Blu-ray source material looks more like 4K on this screen.
Why the hell would I change this for a rev 2? Because NFSMW isn't running at 60fps? Nah, that **** should be played on a PC with a wheel.
Also remember: performance improvements can come with updates in the near future, and you can always tweak settings for specific games to run even better. Nvidia has done it for a long time; Google can also do it when they see that only 99% of games run at 60fps.
My next tablet upgrade will be when they run at 120Hz with 120FPS - not before.
Just reading through the above - the brief use I had of the Nexus 10 was a great experience!
I've been checking the stock in the UK and it hasn't been replenished... then I read that article... and thought, as usual, should I wait? (Not that I have any choice right now!)
Maybe instead of concentrating on the N10 v2, they should concentrate on getting the current version back in stock and their accessories in order for all their products!!
And to reiterate: the screen was super dope!!
Could the low stock indicate that they are working on a rev 2?
Sent from my GT-I9100 using xda premium
Performance-wise this tablet is a milestone in the Android world... Dafuq did I just read?!
Here is another place running the story. Sounds like copy/paste 'journalism' to me.
http://nvonews.com/2013/01/21/new-g...imus-g2-ascend-w2expected-for-mwc-2013-specs/
More Powerful Google Nexus 10: It was only recently that we received the large 10-inch Nexus tablet from Google. The device, made by Samsung, has an amazing 2560 x 1600 display. This high resolution is highly praised, as it beats both the iPad 3 and iPad 4. But the device lacks a powerful processor: Samsung used its Exynos 5 chipset with a dual-core Cortex-A15 CPU and a Mali-T604-class graphics processor.
Many customers find the device underpowered because of its dual-core CPU and less competent graphics processor. As per rumors, Google is likely to produce a quad-core version of the device. Along with the CPU upgrade, the tablet will have an 8-core Mali-T628 graphics processor. The projected tablet will have a fresh Android version and the same 2GB of RAM.
borchgrevink said:
Could the low stock indicate that they are working on a rev 2?
Sent from my GT-I9100 using xda premium
Click to expand...
Click to collapse
I don't think so, but it would be great; it would lower their RMA costs, since so many people returned their tablets.
Garbage
I doubt they will update it until a year after the first one came out, other than to add a SIM card.
Incremental upgrade.....cool
Sent from my Nexus 10 using XDA Premium HD app
I'm sure Google is touting around a bunch of tablet prototypes, and these guys might even have seen one, but I doubt it will be a revision 2 of the same product. It will probably be the follow-up model, and we'll see it about a year from now, which is fine and expected. The only thing that makes this somewhat plausible is that Google is the worst company in the world at keeping secrets. Or they are the most genius speculation-generating company in existence.
I can't see moving off the N10 (personally) in a year. The mobile market's technology advancement is moving at a disgusting rate, though. I love bleeding-edge technology and such, but they need to slow it down a notch.
You guys certainly are an emotional lot.
First, the Exynos 5 Dual is more than capable of running a 10" tablet. It's a more modern architecture than the Exynos 4 Quad, with more memory bandwidth, and it's on par with the S4 Pro, the other new-gen SoC being heavily used right now.
Samsung's Exynos 5 Dual integrates two ARM Cortex A15 cores running at up to 1.7GHz with a shared 1MB L2 cache. The A15 is a 3-issue, Out of Order ARMv7 architecture with advanced SIMDv2 support. The memory interface side of the A15 should be much improved compared to the A9 [Exynos 4]. The wider front end, beefed up internal data structures and higher clock speed will all contribute to a significant performance improvement over Cortex A9 based designs. It's even likely that we'll see A15 give Krait a run for its money, although Qualcomm is expected to introduce another revision of the Krait architecture sometime next year to improve IPC and overall performance. The A15 is also found in TI's OMAP 5. It will likely be used in NVIDIA's forthcoming Wayne SoC, as well as the Apple SoC driving the next iPad in 2013.
The Mali-T604 is a huge leap above anything Samsung has shipped before. And if there's a quad-core version of the Exynos 5, it'll use the exact same Mali GPU, just as the dual and quad-core versions of the S4 Pro share the same GPU.
Samsung's fondness of ARM designed GPU cores continues with the Exynos 5 Dual. The ARM Mali-T604 makes its debut in the Exynos 5 Dual in quad-core form. Mali-T604 is ARM's first unified shader architecture GPU, which should help it deliver more balanced performance regardless of workload (the current Mali-400 falls short in the latest polygon heavy workloads thanks to its unbalanced pixel/vertex shader count). Each core has been improved (there are now two ALU pipes per core vs. one in the Mali-400) and core clocks should be much higher thanks to Samsung's 32nm LP process. Add in gobs of memory bandwidth and you've got a recipe for a pretty powerful GPU. Depending on clock speeds I would expect peak performance north of the PowerVR SGX 543MP2 [iPad 3], although I'm not sure if we'll see performance greater than the 543MP4 [iPad 4].
http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual
So there's no rush to get a rev 2 of the N10 out, as the performance increase from Exynos Dual to Quad wouldn't be that dramatic, and the GPU would be exactly the same. The h/w for the N10 is mostly provided by Samsung, and the components are all latest-generation. The display has an enormous number of pixels, but if you read the article, the h/w is more than capable of supporting it. So any issues with performance will hopefully be addressed with s/w updates vs. the need for an emergency h/w update.
Do people still think these random reboots are a hardware problem of the N10? It is a 4.2 problem. If you have 4.2 on a Nexus 4 or Nexus 7, they all get these random reboots. I just think people return/replace their devices without doing any reading.
Version 2? It sounds like it'd have a better chipset, but I don't know where it stops. It's like Samsung developing the Octa chipset. I just don't think we need any more power than the Snapdragon S4 Dual; that already had plenty of power, and now we are holding the A15-based Exynos 5 Dual. It's like with your desktop computer: you could have an i7 6-core, 12-thread 37XXK CPU that you bought for $600, but you are probably never going to use all its power for years to come.

Snapdragon 800 vs Tegra 4?

Hey guys, so I was planning to get the Tegra Note tablet, but reading up I see the Snapdragon 800 CPU is pretty good too.
Which one do I pick!? I have around the £200-300 mark, and I can't find any Snapdragon 800 tablets out there. I'm a PC gamer looking for a kind of gaming tablet. Any help would be amazing.
Thanks, 181jenkins
I'd go for Tegra 4 if you're a gamer.
Please Don't Forget To Press The Thanks Button!! :thumbup:
Either one will be fine... the Snapdragon 800 is a beast also.
baileyjr said:
Either one will be fine... the Snapdragon 800 is a beast also.
Click to expand...
Click to collapse
Gahh, I'm a performance guy; I like to have the highest in the benchmarks - that's why I was looking at the Snapdragon 800. But now the Tegra 4 has 72 GPU cores... I hate looking for tablets lol.
Tegra 4 has fewer games than Snapdragon.
They are about the same in performance,
but going with the S800 will be a wise choice due to higher dev support.
Go for the Galaxy Note 3 - it has an S800 CPU with 3GB of RAM.
Sent from my IM-A850K using Vegaviet App
baileyjr said:
Either one will be fine... the Snapdragon 800 is a beast also.
Click to expand...
Click to collapse
adeelraj said:
Tegra 4 has fewer games than Snapdragon.
They are about the same in performance,
but going with the S800 will be a wise choice due to higher dev support.
Go for the Galaxy Note 3 - it has an S800 CPU with 3GB of RAM.
Sent from my IM-A850K using Vegaviet App
Click to expand...
Click to collapse
Fewer games? How come? I thought the Tegra 4 would have more "graphics" than the Snapdragon?
I'm looking for a tablet in the £200-300 range.
Thanks, 181jenkins
181jenkins said:
Fewer games? How come? I thought the Tegra 4 would have more "graphics" than the Snapdragon?
I'm looking for a tablet in the £200-300 range.
Thanks, 181jenkins
Click to expand...
Click to collapse
I would like to suggest a tab:
Try Google Nexus 7 (2013)
Please Don't Forget To Press The Thanks Button!! :thumbup:
adi160699 said:
I would like to suggest a tab:
Try Google Nexus 7 (2013)
Please Don't Forget To Press The Thanks Button!! :thumbup:
Click to expand...
Click to collapse
I have the Nexus 7 2013 version, and I want to move away. I mean, I went to the Nexus 7 2012 version for the performance, and it seems they are going away from that and more toward the "screen resolution" side of things. I don't really care about the resolution, as the higher it is, the more FPS it takes away from performance.
Thanks, 181jenkins.
Consider the Tegra Note or Shield
181jenkins said:
I have the Nexus 7 2013 version, and I want to move away. I mean, I went to the Nexus 7 2012 version for the performance, and it seems they are going away from that and more toward the "screen resolution" side of things. I don't really care about the resolution, as the higher it is, the more FPS it takes away from performance.
Thanks, 181jenkins.
Click to expand...
Click to collapse
I am waiting on a Tegra Note for $199 (~150GBP I suspect). Frame rates should be nearly 2x that of the Nexus 7 2013, which has 2.5x the number of pixels and a weaker GPU. The HP Slate 7 Extreme may be the first variant available. It also has computational photography features (real-time HDR, track-focus, and slow motion with a 5MP camera), very good sound, and a highly accurate stylus. Recent builds have shown Antutu scores of >36K (see the YouTube Chinese review), versus ~20K for the highest N7 2013 scores. And yes, that is at least as high as the Sony Xperia Z Ultra phablet, which uses an S800.
The S800 in theory has a stronger GPU but a weaker CPU. The real picture is clouded by the fact that certain vendors (see arstechnica.com_gadgets_2013_10_galaxy-note-3s-benchmarking-adjustments-inflate-scores-by-up-to-20 - replace underscores with slashes) have gamed the benchmarks when using the S800. The T4 has generally shown higher benchmarks than the original reference builds (e.g. Antutu on my Shield is 41.5k vs 36.5k), whereas the S800 has yet to beat its reference-design performance. Of course, their reference phone build was about a half-inch thick, so that might have something to do with it...
If you want ultimate FPS performance, get the Shield. It actually has a (very quiet) fan, so the device can run at close to 100% capacity for hours - something no phone or tablet can do - and there is no benchmark gaming, as confirmed by Ars Technica. The Tegra Note would be a good second choice. After that, things get murkier, with the Sony S800-based Xperia Z Ultra being a decent choice. However, its stylus looks quite inferior to the Note's (no pressure sensitivity, for example), and the higher PPI will translate into slower FPS. The camera is 8MP vs 5MP for the Note, but the Z doesn't have the same real-time capabilities, AFAIK.
Some people claim that the T4 is a "power hog", and Toshiba's poorly engineered T4 tablet heat-sink hasn't helped this view. But Laptop Magazine has shown the Shield running a 4.2W TDP for the T4 at 1.9GHz under hours of heavy use, which compares favorably to the S800 design TDP (see fudzilla.com_home_item_31532-qualcomm-aims-at-25-to-3w-tdp-for-phones).
Hope that helps!
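A quick check of the pixel-ratio claim in that post (a sketch; it assumes the commonly quoted 1920x1200 Nexus 7 2013 panel and a 1280x800 Tegra Note panel):
Code:
n7_2013 = 1920 * 1200    # Nexus 7 2013
tegra_note = 1280 * 800  # Tegra Note (assumed 800p panel)
print(f"pixel ratio: {n7_2013 / tegra_note:.2f}x")  # ~2.25x, near the quoted 2.5x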
deppman said:
I am waiting on a Tegra Note for $199 (~150GBP I suspect). ... Hope that helps!
Click to expand...
Click to collapse
I've got the Z Ultra as a daily driver and it's very good... and the Snapdragon 800 CPU has been overclocked to 2.7GHz over in the Z1 forum. Not that that's recommended lol.
That Tegra Note looks like a really sweet gaming platform for the price, though. I suspect they have kept the resolution down to keep the price down, but that resolution is perfect for better gaming performance. The stylus is a plus as well...
It's a pity the Tegra 4 hasn't made its way to more phones/phablets. I think they will shift far more tablets than Shields, as it's a more versatile device for more people. Anyone know if it will have the streaming capability of the Shield? I suspect not.
Bro, go for the Xolo Play Tab Tegra 4; obviously Nvidia is the best for gaming, and Tegra is also overclockable.
Sent from my GT-I8552 using xda premium
deppman said:
I am waiting on a Tegra Note for $199 (~150GBP I suspect). ... Hope that helps!
Click to expand...
Click to collapse
I was looking at the Shield, but I'm in the UK, and the price of getting it shipped would be £100-ish with import tax etc. I can't see the point in spending around £350 on a Shield when it retails at around the £250 mark :/ I wish I could get one, but for now I'm waiting for the T4 Note. Anyone have any idea when it's coming out!? I keep thinking about the T4 for the Tegra Zone and better-looking games.
Thanks, 181jenkins
More Tegra Note Info
The Chinese reviews are below. Use Google Translate. Also, I had to replace '/' with ' ' to get around the no-link ban.
A good overview is at pie.pconline.com.cn 365 3650116.html. Notice there is a mistake - the Note 7 does have a front-facing VGA cam. And here is a video (remove the spaces): youtube.com-watch ?v= 4xOL3MtXEPU.
The video shows good benchmark scores and highlights the use of the stylus. Personally, I wish this damn thing would come out! I really want a tablet with a stylus!
deppman said:
The Chinese reviews are below. Use Google Translate. Also, I had to replace '/' with ' ' to get around the no-link ban.
A good overview is at pie.pconline.com.cn 365 3650116.html. Notice there is a mistake - the Note 7 does have a front-facing VGA cam. And here is a video (remove the spaces): youtube.com-watch ?v= 4xOL3MtXEPU.
The video shows good benchmark scores and highlights the use of the stylus. Personally, I wish this damn thing would come out! I really want a tablet with a stylus!
Click to expand...
Click to collapse
Seen it all - I just hate waiting for it lol. There were rumours it was going to be released on the 16th of October, but obviously not :/ Anyone had any more info about this??
