AnandTech touches on thermal throttling on N10 - Nexus 10 General

Anandtech talks about the power efficiency of the new generation of chips and mentions how the Nexus 10 gets throttled down in high-stress graphics situations. Here's the specific page, at the bottom. And the article really shows just how much power the Cortex-A15 consumes, much, much more than other chipsets.
http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13

One of the most misleading articles I've ever read on Anandtech, that. It's full of interesting info, but ultimately there are few conclusions you can really draw other than that the 5250 has a very high TDP!
A lot of graphs show total power consumption when running a given benchmark/task, and then use this data to make assumptions on architecture/chipset performance. Even ignoring the "total device power draw" graphs (the N10 screen will suck MUCH more power than the crappy 1366x768 panels in the other tablets tested) and sticking purely to the CPU/GPU power draw comparison graphs, it must be considered that these devices are running a COMPLETELY different software stack!
This is like drawing comparisons on tyre grip when tyre A has been tested on tarmac, fitted to a 2-ton Bentley, in 40C ambient temps, and tyre B on snow, fitted to a 500 kg Caterham in -20C ambient: there are simply too many differences to even try to perform any kind of comparison between them. All you can do is look at the test results in isolation.

Agreed total power is way affected by the N10's screen, but at least it gives people an answer as to why they are getting slow down in games like NFS:MW

stevessvt said:
Agreed total power is way affected by the N10's screen, but at least it gives people an answer as to why they are getting slow down in games like NFS:MW
Except for the people who aren't getting any slowdowns in NFS:MW on the N10, myself being one of them.
So, no, it doesn't provide a conclusive answer for that, either.
What it does is provide another data point

ZanshinG1 said:
Except for the people who aren't getting any slowdowns in NFS:MW on the N10, myself being one of them
I'd have to see it to believe it at this point. Can you actually distinguish when FPS changes occur (no offense or anything like that; I know someone who claims a game ran "smoothly" to them, and I can see framerate jumping all over the place, and not even being that high to start with)?
Perhaps you have a decent camera (60 FPS recording preferred) where you can show proof of such? And also are you using a custom kernel or ROM?

I've noticed that the ambient temperature in the room influences thermal throttling. If I'm sitting in a room with a jacket on and it's 65F/18C then I don't have throttling issues like when I'm sitting near the fireplace and the ambient temperature is around 80F/27C. Maybe that's obvious but just bringing it up as a possible reason why some people may not see throttling during hard gaming. I definitely see throttling playing Critical Strike Portable (Multiplayer online), and I don't remember seeing that on the N7. I still use the N10 for gaming though because the screen is so nice, I just cringe every time I see some throttling.

I changed my thermal throttle limit to 90, had no problems so far
Sent from my Nexus 4 using Tapatalk 2

Pretty good write up..
One thing I find really interesting is those insanely low GPU power consumption numbers from the N10 during the SunSpider, Kraken, etc. tests. The article didn't mention it (surprisingly), but there are two pieces of tech in the Exynos 5 that are somewhat related to that:
PSR (Panel Self Refresh) mode may be showing its face in the browser benchmarks; it cuts a lot of power when the screen is sitting on a static image.
And OpenCL support, which doesn't look like it's being utilized here (GPU power consumption would probably be higher if it were), but it should bring total power consumption down by using the GPU cores to help out with task processing, similar to CUDA. I'd love to see this implemented since our SoC supports it.

Related

SGSII can take more FPS at 720p, like 60??

Some phones can record slow-mo at VGA resolution (usually around 200 fps).
So... the SGSII, with its extra power, could surely do 60 fps at 720p, or at WVGA?
Why am I saying this? Well, if you film at any party, race event, or something similar, 30 fps isn't the same as 60 fps (60 is more fluid).
So... is it possible?
Thanks, and forgive my bad English.
According to Wikipedia, it should be possible...
Generally, the lower the resolution, the higher the FPS (found that on the German Wikipedia).
But I guess you'll need an application to do that.
Well, it's a bit more complicated than just lower resolution meaning you can have more frames. You have to consider things like the CMOS line read speed and the shutter speed. And if someone did write an app, it would suffer heavily from rolling shutter; it does now, and a high frame rate would make it worse. I may look at the media profiles, because it would be good to see if it can be done.
Sent from my GT-I9100 using XDA Premium App
Sounds like a good idea. If someone can do this say @ 60fps, maybe we can check if there's some difference. If not much difference, then we can stick to what we have now.
rd_nest said:
Sounds like a good idea. If someone can do this say @ 60fps, maybe we can check if there's some difference. If not much difference, then we can stick to what we have now.
My cousin has a high-performance digital camera that records 720p at 30/60 FPS and 1080p at 30 FPS...
And I can say there is a noticeable difference between 30 and 60 FPS: the video is more fluid, and you can appreciate more detail when the camera is moving, like in a car race.
Soon I will get my SGSII, after reconsidering the O2X.
I for one would like this too, especially if you could go even further down the scale, something like this:
1080p30 (Default)
720p60
480p120
240p240
Naturally, filming at something ridiculous like 320x240 (yes, yes, it's not 16:9) at 240 FPS may not be useful to many, but it would still provide some great slow-motion shots.
I also fully expect the device not to be capable of shooting at that many frames per second anyway; 120 is pushing it.
Like I said previously, it depends on shutter speed!
The shutter speed on old cameras used to be the amount of time the shutter exposed the film to light; on a modern DSLR (Digital Single Lens Reflex) it's the amount of time the mirror locks up and exposes the sensor; for video on a CMOS sensor it's the speed at which the sensor is scan-line read. So doing 120 fps would simply not be possible if it takes 1/30th of a second for the sensor to be read.
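To put numbers on that, here's a tiny back-of-the-envelope sketch. The 1/30 s readout time is just the hypothetical figure from the post above, not a real spec for this sensor:

```python
# Back-of-the-envelope: a rolling-shutter CMOS sensor cannot exceed the
# frame rate implied by its full-frame readout time, no matter what the
# app or driver requests.
def max_fps(readout_seconds):
    """Upper bound on frame rate for a given full-frame readout time."""
    return 1.0 / readout_seconds

readout = 1 / 30                  # assume the sensor needs 1/30 s per full read
print(max_fps(readout) >= 120)    # False -> 120 fps is physically impossible
print(1 / 120)                    # readout time needed per frame for 120 fps
```

In other words, unless the sensor can be read out in under ~8.3 ms, no software change can deliver 120 fps.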
After looking at this a LITTLE, I think most of the crap capability is embedded in the camera firmware. It's possible to change the media profiles, but doing so achieves nothing.
I could be wrong, and I would like to do an app that can record at 60 or 120 fps or even more, but after looking at this, it would need a better dev than me, and even then I would GUESS it isn't likely, due to the hardware/camera firmware.
I'm happy with my camera performance.
So... for example:
if the SGSII had a third CPU core (a secret, disabled CPU), you would not unlock it?
"It's working fine, so it isn't necessary"...
That's ridiculous...
tomeu0000 said:
so... for example:
if the SGSII have another 3rd CPU ( secret disabled CPU ) you will not unlock it?
It's working good so isnt necessary...
that's ridiculous...
I didn't make the comment that you're referring to, but I do think your analogy is a little off. You're kinda saying, if you had an Audi and a Ferrari, wouldn't you want to use the Ferrari?
But it's actually more like: well, we have a Ferrari, now let's try to tune the hell out of it (with a risk of damage) to get more out of it. Some people just wouldn't want to do that, lol.
And yes, a CMOS sensor creates heat, and I suspect making it read faster creates more heat. So yeah, possible damage.
Personally I still want to look into this when I have some more free time though
deanwray said:
I didn't make the comment that your referring to, but I do think your analogy is a little off. You kinda says if you had a audi and a ferrari, wouldn't you want to use the ferrari?
But it's actually more like, well we have a ferrari now lets try and tune the hell outta it (with a risk of damage) to get more out of it. Some people just wouldn't want to do that lol.
And yes, a cmos sensor creates heat, and I suspect making it read faster creates more heat. So yeah, possible damage
Personally I still want to look into this when I have some more free time though
Some phones on the market support a lot of features that are disabled by default...
Like the Motorola Defy, which has an FM transmitter that isn't enabled...
The Defy now has 720p recording instead of the default WVGA, and that isn't dangerous for the phone...
It's more dangerous to overclock the CPU than to modify system files... (at least if you know what you are modifying)
About 3 years ago I had the Samsung INNOV8 (i8510, running Symbian), the first phone with the wide-angle 8 MP camera (which I suspect remained exactly the same throughout the chain of models), and there was an integrated option in the camera software for 120 fps video, low-res of course (TI OMAP 2430 with a 330 MHz CPU).
So if they do indeed use the same camera unit, does that mean it allows that frame capture rate? Gibberish I can barely understand...
tomeu0000 said:
Some phones on the market support a loot of features, but that, are disabled by default...
Like the motorola defy, have a FM transsmiter but isn't enabled...
Like the Defy, now have 720P recording, not default WVGA, that isn't dangerous for the phone...
is more dangerous to overclock the CPU, than modify system files... ( ever if u know what are you modifyng )
Again, we're not talking about an extra feature; we're talking about cranking an existing feature up beyond where it is at the moment. And YES, there are potential risk factors in altering the read speeds of a CMOS sensor. I know of many CMOS sensors that have burned out under normal use (manufacturing error, heat), and cranking them up increases the chance of damage.
I think you misunderstand what would need to happen. Altering system files is easy, but what actually happens in a hardware sense is that instead of the sensor being read 30 times every second, it would be read 120 times every second. That, due to the increase in electrical flow, would create more heat, and increasing heat is not always a safe thing to do to electrical components.
Now I'm not saying it can't be done, or that I'm positive it would damage my phone. I only mean that calling a poster ridiculous because they don't want to take the chance, while citing a disassociated analogy, is perhaps a little wrong.
Also, saying that overclocking the CPU is more dangerous is a little off, unless you have info on the CMOS that I can't find. If it's rated at 30 reads per second and 35 reads for a minute would blow it, then no, it's not safer. It's almost EXACTLY like overclocking your CPU.
Anyway, I'm looking to get some info on the CMOS and camera hardware inside the SGS2, but it's hard to find.
bahkata said:
About 3 years ago I used to have the Samsung INNOV8 (i8510, running on symbian), first one with the wide-angle 8mpx camera ( which i suspect remained exactly the same throughout the chain of models) and there was an integrated option in the camera sw for 120fps video, low-res ofcourse ( TI OMAP 2430 with a 330Mhz CPU )
So if they, indeed, use the same camera unit does that mean it allows that frame capture rate gibberish i can barely understand?
Would be good if it was, but 3 years? Hmmm
I need to find info on the camera hardware I think
Those who are "content" are on the wrong forums
We're here to push boundaries, find new frontiers.
You're worried that using the camera at 120 FPS is going to melt the device? I severely doubt that; you can overclock the I9100 from 1.2 to 1.5 GHz and it gets WARM, for example, but not hot enough to cause damage.
Accessing the camera at a faster rate won't generate much heat; at the end of the day you're just reading the values across the CMOS sensor. It's not having to do complex mathematical calculations, so it won't generate much heat, if any at all.
foxdie said:
Those who are "content" are on the wrong forums
We're here to push boundaries, find new frontiers.
You're worried that using the camera at 120FPS is going to melt the device? I severely doubt that, you can overclock the I9100 from 1.2 to 1.5GHz and it gets WARM for example, but not hot enough to cause damage.
Accessing a camera at a faster rate won't generate heat, at the end of the day you're just reading the values across the CMOS sensor, it's not having to do complex mathematical calculations so it won't generate much heat, if any at all.
Reading a CMOS = electricity = heat. Same with any CMOS (less so with a CCD).
Not sure if this thread is still alive, but...
I was looking to do the same, and today I saw SiyahKernel 2.2 beta 6:
http://www.gokhanmoral.com/gm/
which removed the 30 fps limit:
"increased the fps limit in the camera driver (30 to 120). I hope that the one who sent me a PM about this modification can manage to use it to have better image or video quality."
So technically the driver will no longer stop you from going above 30 fps in videos.
I tried LGCamera/lgcamcor from the Market, since it allows you to select the FPS in the video recording settings (I selected 60), BUT it didn't record at that FPS.
I'm guessing the camera settings are phone-specific.
Just wanted to share this, since I'll keep trying/asking around; I thought the people in this thread might also have some experience with it.
Agreed, 60 fps would be a great idea, as the SGS2 has a powerful camcorder for making movies. Please, anyone have any idea about that??
A bit off topic but download fast burst camera from 4shared, amazingly fast!
Sent from my GT-I9100 using xda premium
Fast burst cameras are a bit pointless IMHO; better to record video... because all these fast burst apps really do is save the on-screen preview buffer...

How about the Incredible S? I want to get one.

Just tell me something that might help me... thx
Okay, you can use www.gsmarena.com for comparing phones but I'll break it down to you.
It would help me if you wrote for what you want to use the phone but let's say you will just use it for regular stuff.
The screen is fair; 4 inches is neither too big nor too small. The camera is pretty good, but the edges of pictures are a bit too artificially sharpened,
which leads to lower quality; it's a common thing with many phones, and the same goes for video recording. The CPU and GPU are really good.
I have a phone with the same CPU and GPU and it runs graphically intensive games almost without any drops. 768mb RAM is okay but could be better.
So... This phone is okay for casual gaming and just everyday use. It isn't something super special but in the end it all depends on the price.
If you find it slow, you can always put a ROM on it, that will speed things up for sure!
Hope I helped

[HELP][SK17i] Buying the phone.

Hello,
I'm currently a proud owner of an HTC Tattoo (running Android 2.3.7) and I'm looking to buy a new phone. There are a few phones I had in mind, but the Mini Pro is currently my favourite. I love the fact that it has a real QWERTY keyboard, a 1 GHz processor, and a small screen (I like a phone that fits in my pocket).
But, I have few questions:
Do the newer versions have the noise issue ?
Is the touch screen responsive ?
Is battery life good, and is battery overheating ?
Is reception good ?
Is the phone generally fast ? (I'm tired of Tattoo's occasional lags)
So what do you think, should I buy it ?
Kind regards,
Andrew
Some have reported that they still have the electrostatic noise even after updating to 2.3.4 although Sony Ericsson have stated that it's fixed. Of course, as an owner of a Sony Ericsson Xperia mini Pro sk17a, after finally updating, I still get the electrostatic noise.
I've seen much improvement with the touchscreen after updating to 2.3.4; it's pretty responsive now.
In terms of battery life, it's decent, I can get several hours of movie watching on it (like 10 hours) and it still hasn't reached 15%. Charging is really fast, you can probably charge the phone dead to full in an hour. The battery won't overheat unless you're playing games on the beach. I haven't seen the phone go above 42C.
Reception depends on location, carrier, etc, factors like that. I'm getting full bars right now so I guess it's good.
The phone is generally fast, but you will see a difference compared to your old phone, or to a high-end phone like the SGS2. I haven't seen much lag, and what I did see was very occasional.
Do you like the 3-inch screen? The thick form factor?
Whether you should buy it or not, it's your choice. It's a pretty good mid- ranged phone.
Feel free to ask more questions.
AndrewB. said:
Do the newer versions have the noise issue ?
Is the touch screen responsive ?
Is battery life good, and is battery overheating ?
Is reception good ?
Is the phone generally fast ? (I'm tired of Tattoo's occasional lags)
"Newer" phones have had a hardware fix - according to Sony Ericsson. The jury is out on that one though as there's no confirmation of the build number's that have the fix and there's no real agreement on whether it has "gone" or merely been reduced to a lower level. In fact, there appears to be at least two different noise problems - a rythmic pulsing and a higher-pitched clicking noise - so it's often difficult to know which people are talking about when their posts aren't detailed.
Touch screen - excellent. I've had no problems with it - even with a screen protector attached.
Battery life - difficult question. If used as a phone, it will last for ages. If you use wi-fi and other heavy draws, it will obviously reduce battery life. Even then, it's very difficult to make a fair comparison with any other phone - you'd have to have a genuine like-for-like comparison - even the signal strength and line speed from the network source will impact on how much work the phone has to do. As a purely subjective opinion - it's acceptable. I often leave wi-fi on all day and play games, watch videos and generally "play" with the phone for a full day - I just recharge it at night - and it's never run flat. Regarding heat - I've never noticed it even getting more than vaguely warm.
Reception - an odd one. It does seem well able to pick up the weakish signal where I live, but it has a strange habit of losing the connection even when there is a strong signal. It doesn't do that all the time, and it appears to me to be a software rather than a hardware problem, as it seems to happen after using certain apps or performing certain functions. Having said that, I've yet to lose a call.
Is it fast - hell yes. A 1 GHz single-core processor is hardly going to break any records, but the phone "feels" great - I've yet to find anything that suffered any obvious issues as a result of the CPU speed. Considering the price - which has recently halved in the UK - it is miles faster and more responsive than anything else I could find. Video playback is smooth and seeking within vids is instantaneous, webpages are rendered very quickly - including graphics - and, of course, the phone supports Flash properly, so that includes animated graphics. The games I've played have been smooth and I've not noticed any feeling that the phone is holding me or the game back. There is an occasional, very slight, stammer if playing a game which has banner adverts popping up, but I've seen that happen on phones with faster CPUs - and switching off network connections (or rooting and installing an ad blocker) stops that anyway.
Overall - you can't buy £500's worth of performance for £130 - but I reckon this phone delivers twice what it cost. If you need top-of-the-range performance, you have to pay top dollar - but if you are looking at a phone in this price range, I don't think there are currently any serious alternatives.
One thing - the small screen is something you have to take into account - and that is entirely down to what you expect to use the phone for. Bear in mind that although the screen is small in size, the resolution is actually pretty good compared with most phones at this price. It's not ideal if you want to do a lot of reading or browsing - text is lovely and clear but no screen this size can compete with a larger screen where you can see more than a paragraph at a time. For general browsing though - checking the news and reading through forums etc - it's perfectly good.
It's worth mentioning too that the keyboard is great. The keys look tiny but there's plenty of space between them so accurate typing is pretty easy.
There's one big downer for me with this phone and even though it's something you didn't ask about (and may not matter to you), it deserves a mention. The camera - still and video - is awful. The still camera produces lifeless, soft pictures that look like they came from a sub-1M camera from ten years ago. The supposedly "HD" video camera is just about acceptable in the brightest of natural light - otherwise it is soft, grainy and about as "good" as the average 1.3M webcam - only the pixel size of the output film is high - as if it filmed at 1M and then stretched the film in software. This is one of the most common criticisms of this phone and it is hoped that SE can fix these problems as they appear to be down to the software over compressing the photos and vids to reduce file-sizes. A "quality" setting for the camera (often found on higher resolution cameras) would be the obvious solution - and should be easy to add.
And then there's the BIG upper for a lot of people - SE are already working on getting Android 4 - Ice Cream Sandwich - rolled out for this phone. It is hoped that we'll start seeing that within a few weeks. If nothing else, that will protect the resale value of the phone - you should be able to get a good price if you sell the phone in a year's time because it will still be fairly up-to-date - the same doesn't apply to many other phones in this price-range.
Hope that helps
Thank you for helping me decide.
I bought the phone and I love it.
AndrewB. said:
Thank you for helping me decide.
I bought the phone and I love it.
congrats man! =D

New high-resolution Prime performance (the Google+ article by Dianne Hackborn)

Hi all,
I know this article has been floating around here for some time, but this I found rather interesting:
Some have raised points along the lines of Samsung Galaxy S2 phones already having a smoother UI and indicating that they are doing something different vs. the Galaxy Nexus. When comparing individual devices though you really need to look at all of the factors. For example, the S2's screen is 480x800 vs. the Galaxy Nexus at 720x1280. If the Nexus S could already do 60fps for simple UIs on its 480x800, the CPU in the S2's is even better off.
The real important difference between these two screens is just that the Galaxy Nexus has 2.4x as many pixels that need to be drawn as the S2. This means that to achieve the same efficiency at drawing the screen, you need a CPU that can run a single core at 2.4x the speed (and rendering a UI for a single app is essentially not parallelizable, so multiple cores isn't going to save you).
This is where hardware accelerated rendering really becomes important: as the number of pixels goes up, GPUs can generally scale much better to handle them, since they are more specialized at their task. In fact this was the primary incentive for implementing hardware accelerated drawing in Android -- at 720x1280 we are well beyond the point where current ARM CPUs can provide 60fps. (And this is a reason to be careful about making comparisons between the Galaxy Nexus and other devices like the S2 -- if you are running third party apps, there is a good chance today that the app is not enabling hardware acceleration, so your comparison is doing CPU rendering on the Galaxy Nexus which means you almost certainly aren't going to get 60fps out of it, because it needs to hit 2.4x as many pixels as the S2 does.)
To be complete, there is another big advantage that the GPU gives you -- many more drawing effects become feasible. For example, if you are drawing a bitmap in software, you basically can't do anything to it except apply an offset. Just trying to scale it is going to make rendering significantly slower. On a GPU, applying transformations well beyond simple scales is basically free. This is why in the new default Holo themes in Android we have background images -- with hardware accelerated drawing, we can afford to draw (and scale) them. In fact, if the hardware path is not enabled by the app, these background images will be turned off.
This is kinda the same as with the Prime and the TF700/other high-resolution tablets, isn't it? I'm not sure, but it sounds pretty plausible, since the Tegra 3 GPU isn't very good (well, it's fine, but I'm not sure about those high-res screens). However, I could be completely wrong..
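The 2.4x figure in the quoted article is straightforward arithmetic on the two panels' pixel counts; a quick sketch:

```python
# Why a 720x1280 screen needs ~2.4x the single-core fill rate of a
# 480x800 one for the same CPU-rendered frame rate.
galaxy_nexus = 720 * 1280   # 921,600 pixels
galaxy_s2    = 480 * 800    # 384,000 pixels

print(galaxy_nexus / galaxy_s2)   # 2.4

# Pixels a CPU renderer must touch per second to hold 60 fps on each:
print(galaxy_nexus * 60)          # 55296000
print(galaxy_s2 * 60)             # 23040000
```

That per-second pixel budget is what pushes 720p-class screens past what a single ARM core of that era could fill in software, and why GPU rendering matters more as resolution goes up.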
I agree. It's the same with a gaming PC: just because your monitor is 1080p doesn't mean you can play all games at that resolution; you need a much more powerful GPU. I'm certain the Tegra 3 can support 1080p, but it won't be as smooth as 720p is on our device. Unless you lower the resolution, but how would you do that on Android? And think how ugly games that aren't optimized for 1080p would look.
Nvidia always!
The question isn't whether there's going to be a performance hit, it's what the performance hit looks like. If it's invisible in everything but gaming, I'd bet a lot of people will go for the HD display and gamers will stick to the lower res. If it's obvious in UI performance and transitions, it makes the benefit of the HD screen a little more questionable. The new chip in the iPad3 and Samsung's new Exynos chip won't make you choose (on paper). Benchmarks are useless except for bragging rights.
I have been saying this since people were trying to compare the new Acer and Samsung back in December. The higher the resolution, the more power and resources it takes. Also, you have to look at the app market right now. What apps are out that will use that 1080p display? NONE as of now. Once 1080p tablets are released, it will be a few months before most apps adapt to the new higher-resolution displays.
I continue to question the need for a 1080p 10-inch display - there has to be a limit as to how high a PPI count the human eye can reasonably distinguish. Just bumping up the resolution while not improving the actual render process (in the case of games or animations) does not make any sense to me.
A retina display just for the heck of it is not a great idea, at least to me.
For what it's worth, ICS is supposed to be fully hardware accelerated, so the Tegra 3 could be enough to power the higher resolution for everything but games.
Anandtech (who I probably trust the most when it comes to hardware evaluations) seemed to suggest in an early preview that the higher resolution *may* perform ok:
http://www.anandtech.com/show/5348/...-with-asus-1920-x-1200-tablet-running-ics-403
That said, there are still questions as to the benefit of such a high resolution on a 10" form factor designed to be held only 1-2' away from your face. They didn't bump up to 1920 x 1200 resolution monitors until 24" LCDs and up.
The real issue is that games on Android don't let you pick a resolution for them to run at. Almost all run at the full Res of the screen, which means slideshow on a 1080p Prime.
avinash60 said:
I continue to question the need for having a 1080p 10 inch display- there has to be a limit as to high a ppi count the human eye can reasonably distinguish. Just bumping up the resolution while not working on improving the true render process (in case of games or animations) does not make any sense to me.
A retina display just for the heck of it is not a great idea, at least to me.
I agree, there is just no point... there are more important things to improve than pixel count...
Thanks, at least I am not alone on this idea. It seems like when the news came that the iPad 3 was going to have a Retina display, all the manufacturers stopped caring and just thought "We need that too!". I am comparing the figures from this thread with my HTC Sensation, which should have a higher DPI:
Transformer Prime: 149
The new Prime: 218
HTC Sensation: 260
and from a NORMAL viewing distance both look great. However, when I get closer, the pixels on the Transformer Prime are slightly visible where the Sensation stays sharp. Then again, the phone has a higher DPI than even the new panel, so I'm not sure how that will compare.
I'm sure it will look somewhat better, but I'm not sure it's worth the wait (again), and there's also the possibility that the new Prime can't keep up with its own resolution..
Oh, and I'm not trying to defend the Prime here.. I have to return it anyway because of backlight bleeding, and I'm not sure whether I want a new one or my money back. But looking at these numbers, I think the resolution is just pure marketing.. I mean, who is going to sit with their Prime 5 cm from their head.. lol.
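For reference, DPI figures like the ones above can be sanity-checked with the standard formula (diagonal pixel count divided by diagonal size in inches). The panel sizes here are assumptions (10.1" for both Prime panels, 4.3" for the Sensation), so the results land close to, but not exactly on, the numbers quoted:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1280, 800, 10.1)))   # 149 -> Transformer Prime
print(round(ppi(1920, 1200, 10.1)))  # 224 -> a 10.1" 1920x1200 panel
print(round(ppi(540, 960, 4.3)))     # 256 -> HTC Sensation (qHD)
```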
http://androidandme.com/2012/01/news/hands-on-with-the-acer-iconia-tab-a510-and-zte-7-tablets/
Watch the video on Acer Iconia a510 (unannounced tablet). 1080p that comes with this tablet... does look a bit sluggish.
Just to add, my Galaxy Nexus is 316 DPI..... unless you're 2 in from the screen, there really isn't much difference.
Also, I love how laptop and desktop DPI is half what most phones/tabs have, and yet people are having a fit......
http://en.wikipedia.org/wiki/List_of_displays_by_pixel_density#ASUS
It seems to run pretty well given that it's still a pre-production model, though not as smooth as the Prime with ICS, yes..
Danny80y said:
Just to add my galaxy nexus is 316 dpi..... unless your 2in from the screen...there really isn't much difference.
Also, I love how laptop and desktop DPI is half what most phone/tabs are and people are having a fit......
http://en.wikipedia.org/wiki/List_of_displays_by_pixel_density#ASUS
Yeah, that's exactly what I mean.. you can see it if you're very close to the screen, but why would you do that, lol.
Oh, btw.. for the iPad 1 & 2 it's still 132, which is much lower than our Transformers (149.5), and I've never heard real complaints about that.
>What app's are out that will use that 1080p display...NONE as of now
eBooks & PDFs. Sharper texts. More texts. One can conceivably view 2 pages side-by-side (16:10 / 2 = 8:10, or close to the 8.5:11 printed page).
With display mirroring, you get 1:1 pixel ratio when plugged into a HDTV via HDMI. This makes above use-case (high-density text consumption) much more feasible. Ditto for remote access.
Gaming perf will take a hit. Then again, gaming isn't exactly an Android forte right now, or for mobiles in general. The bulk of games are casual stuff, geared for handset resolution.
One can argue that hardcore Android gaming will prosper over time, and FPS perf will matter more. There are problems with this line of thought. First is simply the assumption that Android will prosper on tablets, which given current sales is hardly a foregone conclusion. Second are the fast advances in hardware and the correspondingly short lifespans. GPU-wise, the Teg3 isn't the fastest even now. By the time we get to see enough hardcore games, we'd be on Teg 5 or 6, or their equivalent. Teg3 will be old news.
But sure, if shooters and frame count are your thing, then 720p sounds like a plan, at least for the Teg3.
>I continue to question the need for having a 1080p 10 inch display
Some don't see the need for GPS in tabs either. Some don't use the cams. Different people have different uses. You shouldn't generalize your use to be everyone else's.
Rest assured that when it comes to marketing, toys with lo-res display will be viewed as inferior. Bigger is better. It's the same thing with quadcore vs dualcore vs single-core. Do you actually need a quadcore?
>there has to be a limit as to high a ppi count the human eye can reasonably distinguish
This argument has been bouncing around ever since Apple's Retina Display. Per this PPI calculator, 1920x1200 is 224ppi on a 10.1". Reportedly, people can discern 300ppi at 12" distance, given 20/20 vision. The real test is simpler and much less theoretical: walk into a store and compare the TF201 and TF700 side-by-side, and see if you can discern the difference.
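The math behind that kind of PPI calculator is simple: the diagonal pixel count divided by the diagonal panel size. A quick sketch (assuming the panels are exactly 10.1" diagonal) reproduces both figures tossed around in this thread:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1920x1200 on a 10.1" panel (TF700-class): ~224 ppi
print(round(ppi(1920, 1200, 10.1), 1))
# 1280x800 on a 10.1" panel (Prime/TF201-class): ~149.4 ppi
print(round(ppi(1280, 800, 10.1), 1))
```

The second result also matches the ~149.5 ppi Transformer figure quoted earlier in the thread, so the numbers being argued about are at least internally consistent.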
>Anandtech (who I probably trust the most when it comes to hardware evaluations) seemed to suggest in an early preview that the higher resolution *may* perform ok:
Anandtech is good for chip-level analysis. For (mobile) system hardware and use-case analysis, he's just as green as many other tech blogs. Note the gaffes in the Prime testing wrt GPS and BT/wifi coexistence. I do see signs of improvement, however. They came out with a new Mobile Benchmark suite, whatever that means.
>The real issue is that games on Android don't let you pick a resolution for them to run at.
The real issue is that Android is still a nascent OS for tablets. HC was a beta which never took off. ICS was just released. The bulk of Android apps & games are still for handsets.
I have been concerned about this as well. Tegra 3's GPU is fine enough for a 1200x800 tablet, but it's going to be stretched at 1080p (this is nearly the resolution that my desktop runs at!).
I'd love a higher-resolution display, but it's a luxury (well, a tablet itself kinda is already, but even more so). It's not as if 1280x800 is cramped and blocky. I'm happy to wait a bit longer for 1080p tablets to mature and come down in price.
(I'd rather have 2GB RAM, actually.)
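To put a number on how "stretched" the Tegra 3 would be: at a fixed frame rate, fill-rate demand scales roughly with pixel count, so a back-of-envelope comparison of the two resolutions is easy:

```python
# Rough fill-rate comparison: how many more pixels the GPU must
# shade per frame at 1920x1200 versus the Prime's 1280x800.
old_pixels = 1280 * 800    # 1,024,000 pixels per frame
new_pixels = 1920 * 1200   # 2,304,000 pixels per frame
print(new_pixels / old_pixels)  # 2.25x the pixels per frame
```

That's 2.25x the fill-rate work per frame with no change in GPU, which is the core of the performance concern here (real-world impact depends on whether a given game is actually fill-rate bound, of course).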
Well, perhaps this new release will coincide with a bump in the Tegra 3's specs. By the time the new tablet comes out, almost half a year will have passed, and that's usually about the time span in which Nvidia would come out with a refresh of a chip design (well, they do this with their desktop GPUs, so not a great comparison, but it's possible?). So in the end the question of performance may be moot, because there will be a faster Tegra 3 and more RAM in the new higher-resolution tablets.
Just a thought.
Don't underestimate it.
Let's wait for a review or test.
The Tegra 3 is probably more than capable of handling this kind of resolution for playing HD movies, high-profile compression, etc.
I saw several tests on the current Prime, and it has no problem with HD videos.
My only concern is battery life ... that's all.
I expect 1920x1200 will result in worse battery life, unless ASUS pumps up the battery capacity or makes some other improvement.
JoeyLe said:
Hi all,
I know this article has been floating around here for some time, but this I found rather interesting:
This is kind of the same as with the Prime and the TF700/other high-resolution tablets, isn't it? I'm not sure, but it seems pretty obvious, since the Tegra 3 GPU isn't very good (yes, it's fine, but I'm not sure about those high-res screens?). However, I could be completely wrong..
gogol said:
Don't underestimate it.
Let's wait for a review or test.
The Tegra 3 is probably more than capable of handling this kind of resolution for playing HD movies, high-profile compression, etc.
I saw several tests on the current Prime, and it has no problem with HD videos.
My only concern is battery life ... that's all.
I expect 1920x1200 will result in worse battery life, unless ASUS pumps up the battery capacity or makes some other improvement.
Asus has already stated that battery life will be pretty much the same as the current Prime's, which with the higher-res screen probably means somewhat shorter in practice. I'll stick with my Prime for now; no need to buy another tablet right now, IMO. I'm waiting to see what Samsung brings to the table.
hyunsyng said:
Well, perhaps this new release will coincide with a bump in the Tegra 3's specs. By the time the new tablet comes out, almost half a year will have passed, and that's usually about the time span in which Nvidia would come out with a refresh of a chip design (well, they do this with their desktop GPUs, so not a great comparison, but it's possible?). So in the end the question of performance may be moot, because there will be a faster Tegra 3 and more RAM in the new higher-resolution tablets.
Just a thought.
I don't think they can bump the specs within the generation of a chip. The only thing that can happen till then is that Asus finds an economical way to add 2GB memory to the device, Nvidia improves the production capabilities of Tegra 3 and we get a better yield of the chips. The spec increase can only happen from one generation to the next.
I think the performance will be fine. Even the battery life.
Most of the battery usage screen-wise is from the backlight, which will be the same.
Also, not much more power may be used necessarily, especially if it doesn't end up taxing the Tegra 3 as much as we think it will. For all we know, our 1200x800 displays may not even be taxing the Tegra 3 that much. If anything, the article suggests the Tegra 3 may be qualified to handle that high a resolution with little to no performance degradation. There are demos on YouTube of a Tegra 3 device playing 1440p movies just fine, all while driving a second screen at the same time.
Of course I too don't feel the need for something that high of a resolution on a 10 inch screen, but I'll never really know until I see one in person.

Kirin 980 performance issues?

It's not clickbait; I saw some early AnTuTu scores and noticed one interesting thing. There's a Normal mode and a Performance (cheat?!) mode.
The phone is of course designed to be used in Normal mode 24/7, since Performance mode produces more heat and raises power consumption beyond a sustainable level.
In Normal mode the phone seriously underperforms in benchmarks; for example, it gets 270k while SD845 phones are getting 290k+. Performance mode bumps the Kirin's scores to 310k+.
The other area where the phone underperforms is the GPU section of AnTuTu. It gets about 9k in Normal mode and 10k in Performance mode, while Adreno 630 powered phones score 12k with ease.
I know these benchmarks don't mean much in general use, but in the phone gaming era you can actually feel that difference (having owned both the Mate 10 Pro and OP6).
I think history has repeated itself and we got an SoC that will barely match the last generation of Snapdragons and be totally obliterated by the next.
Guess we'll know more when Anandtech does an in-depth review.
troublecro said:
It's not clickbait; I saw some early AnTuTu scores and noticed one interesting thing. There's a Normal mode and a Performance (cheat?!) mode.
The phone is of course designed to be used in Normal mode 24/7, since Performance mode produces more heat and raises power consumption beyond a sustainable level.
In Normal mode the phone seriously underperforms in benchmarks; for example, it gets 270k while SD845 phones are getting 290k+. Performance mode bumps the Kirin's scores to 310k+.
The other area where the phone underperforms is the GPU section of AnTuTu. It gets about 9k in Normal mode and 10k in Performance mode, while Adreno 630 powered phones score 12k with ease.
I know these benchmarks don't mean much in general use, but in the phone gaming era you can actually feel that difference (having owned both the Mate 10 Pro and OP6).
I think history has repeated itself and we got an SoC that will barely match the last generation of Snapdragons and be totally obliterated by the next.
Guess we'll know more when Anandtech does an in-depth review.
I think you're right. The next Snapdragon GPU will kick its behind! 'Cause it's not much better GPU-wise than the 845 right now, if at all?
But GPUs these days are overkill for most things, and I'm loving the look of the Mate 20 X. It'll be good enough.
No, actually it has a worse GPU than the 845.
And I wouldn't call all GPUs of this era overkill, taking into account the levels of graphics we're getting in new games.
I know most people don't game on their phones, but the number of those who do is growing rapidly, and the mobile gaming market is becoming larger by the day.
troublecro said:
No, actually it has a worse GPU than the 845.
And I wouldn't call all GPUs of this era overkill, taking into account the levels of graphics we're getting in new games.
Overkill for most games? No? And I'm hoping it's at least on par with the 845, which handles everything I throw at it really easily.
The Mali G76 MP10 is actually slower than the Adreno 630, but at least it should be quite power efficient. Just read the Kirin 980 article on Anandtech.
I'm waiting for Anandtech's review of the Kirin, not previews. I'm not interested in pre-release benchmarks but the real thing, along with power consumption and throttling.
The Kirin 970 also should have been power efficient, but it turned out quite the opposite.
The GPU is probably fine, but not able to compete with the SD845 at normal speeds. So HiSilicon pushed it beyond its limits to get the edge over the SD845, kind of like how the Vega 64 was pushed beyond its limits to get faster than the GTX 1080.
Like the Kirin 960 and 970 before it, the max GPU speed is impressive, but not at all sustainable.
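For what it's worth, here's a minimal sketch of the relative gaps implied by the rough AnTuTu figures quoted earlier in the thread (early, unverified numbers, so treat these as ballpark only):

```python
# Relative gaps from the rough AnTuTu figures in this thread:
# Kirin 980 ~270k Normal / ~310k Performance vs SD845 ~290k total,
# and ~9k GPU sub-score vs ~12k for Adreno 630 phones.
def gap(a: float, b: float) -> float:
    """Percent by which score a trails score b."""
    return round((b - a) / b * 100, 1)

print(gap(270_000, 290_000))  # total, Normal mode vs SD845: ~6.9%
print(gap(9_000, 12_000))     # GPU sub-score vs Adreno 630: 25.0%
print(gap(270_000, 310_000))  # sustained vs peak (throttle headroom): ~12.9%
```

The ~13% Normal-vs-Performance gap is essentially the sustained-vs-peak throttling headroom being argued about, and the 25% GPU sub-score deficit is why the G76 MP10 looks like a step behind the Adreno 630 here.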
