Hi, what processor is recommended? - PC part sales

I want to make a custom PC build, but I wanted to ask which processor is recommended. My search on Wallapop is for an Intel Core i5, but which version do you recommend? Link to my search

@CzechosDrama,
When looking for anything you want to purchase, build, repair, install, or uninstall, always do the research yourself, as that will get you the best results. Luckily, Google has all the information you need for free to do this research, not to mention pricing and availability. Google is your friend.
Thank you!
thisguy12win
Jason Tollakson

If I were to do a PC build, though, I would go with an Intel i7 for the processor. It's really fast and reliable.
Thank you!
thisguy12win

But are you building an office PC or a gaming PC? This is very relevant.

thisguy12win said:
If I were to do a PC build, though, I would go with an Intel i7 for the processor. It's really fast and reliable.
Thank you!
thisguy12win
Thanks for your recommendation, gotta check on Wallapop for a good low-cost i7 processor.

CzechosDrama said:
I want to make a custom PC build, but I wanted to ask which processor is recommended. My search on Wallapop is for an Intel Core i5, but which version do you recommend? Link to my search
If it's not too late, here are some important things to know:
1. The biggest mistake people make is classifying processors only by tier: i3, i5, i7, i9. Don't do this. Performance depends heavily on the generation. For example, the 12th-gen (October 2021) i5-12600K is much faster than the 11th-gen (March 2021) i9-11900K, even though the tier number is lower.
2. Each recent generation has two main desktop i5 models:
- i5-#400
- i5-#600
where # is the generation number.
i5 x600 CPUs are better than i5 x400 CPUs but more expensive.
3. Intel puts letters at the end of some CPU names; the main ones to know are:
K - The chip is unlocked, meaning you can overclock it without voiding the warranty. On 12th-gen and newer parts, K chips also get extra cores.
F - No integrated graphics. If you get a CPU ending in F (e.g. 12400F, 11600KF), your PC will not work unless you buy a separate graphics card (which you probably will anyway). In return, F CPUs are cheaper than the non-F versions.
On top of all that, i7 and i9 are still options depending on the generation.
It all depends on your budget and what you need from your PC.
I know that seemed way too complicated, so let me help you pick a CPU. Just tell me your budget and what you want your PC for.
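To make the naming scheme concrete, here's a rough sketch of a parser for modern Core model strings (my own illustration, not an official Intel tool; `parse_intel_model` and its field names are made up for this example):

```python
import re

def parse_intel_model(name):
    """Roughly decode a modern Intel Core model string like 'i5-12600K'.

    Assumes the scheme used from about the 2nd gen on: the digits before
    the last three are the generation, the last three are the SKU.
    """
    m = re.fullmatch(r"i([3579])-(\d{4,5})([A-Z]*)", name)
    if not m:
        raise ValueError(f"unrecognized model: {name}")
    digits, suffix = m.group(2), m.group(3)
    return {
        "tier": int(m.group(1)),         # 3, 5, 7 or 9
        "generation": int(digits[:-3]),  # compare THIS first, not the tier
        "sku": int(digits[-3:]),         # e.g. 400 vs 600 within a gen
        "unlocked": "K" in suffix,       # overclockable
        "no_igpu": "F" in suffix,        # needs a discrete graphics card
    }

# The example from the post: newer generation beats higher tier
assert parse_intel_model("i5-12600K")["generation"] > \
       parse_intel_model("i9-11900K")["generation"]
```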

A3RNAV said:
If it's not too late, here are some important things to know:
1. The biggest mistake people make is classifying processors only by tier: i3, i5, i7, i9. Don't do this. Performance depends heavily on the generation. For example, the 12th-gen (October 2021) i5-12600K is much faster than the 11th-gen (March 2021) i9-11900K, even though the tier number is lower.
2. Each recent generation has two main desktop i5 models:
- i5-#400
- i5-#600
where # is the generation number.
i5 x600 CPUs are better than i5 x400 CPUs but more expensive.
3. Intel puts letters at the end of some CPU names; the main ones to know are:
K - The chip is unlocked, meaning you can overclock it without voiding the warranty. On 12th-gen and newer parts, K chips also get extra cores.
F - No integrated graphics. If you get a CPU ending in F (e.g. 12400F, 11600KF), your PC will not work unless you buy a separate graphics card (which you probably will anyway). In return, F CPUs are cheaper than the non-F versions.
On top of all that, i7 and i9 are still options depending on the generation.
It all depends on your budget and what you need from your PC.
I know that seemed way too complicated, so let me help you pick a CPU. Just tell me your budget and what you want your PC for.
My budget is €30-€80, and I want my PC for gaming, surfing the internet, and maybe testing VMs.

What exactly is your full budget (like, for everything)? I would personally recommend putting a bigger share of your budget toward the CPU. Ideally you would want a 6-core CPU, but the best thing I can find under €80 is an Intel i3-10100F from 2020 on sale.

CzechosDrama said:
I want to make a custom PC build, but I wanted to ask which processor is recommended. My search on Wallapop is for an Intel Core i5, but which version do you recommend? Link to my search
I would go with an AMD Ryzen G-series CPU (an APU); they come with built-in Vega graphics that perform really well in gaming, meaning you won't have to buy a graphics card. If you go with an Intel CPU, you will have to buy a graphics card. AMD is a real cost saver because their chips are cheaper than Intel's and come with excellent built-in graphics.

That's actually true. The integrated graphics are really good, and you can easily get a 2nd- or 3rd-gen Ryzen 3 or 5 for a good price now. If you ever need to upgrade your GPU down the line, the older RX series is also dropping in price.

A 13th-gen i5 or i7 if you've got a broader budget; the i9 isn't worth the money.
Ryzens are decent, I've got one, but Intel's 13th gen outperforms them this round.


What do you know about the Tegra 3 SoC in the Asus Prime?

-The Tegra 3 SoC (System on a Chip) is a combination of a microprocessor, a memory controller, an audio processor, a video encoder, and a graphics renderer. It's designed and manufactured by Nvidia, a world leader in graphics computing, and makes its first appearance in the Asus Transformer Prime.
-The Tegra 3 SoC has 5 physical cores but is limited to quad-core performance. The 5th, lower-power core is activated only when the device is idle or handling light tasks, such as syncing and e-mail checking. So power consumption is always kept to a minimum when quad-core performance is not needed, ensuring longer battery life. Once you run a normal or higher-demanding task on the tablet, the 5th core shuts off automatically before the 4 main cores are activated. This is all handled by the chip itself and doesn't require the user or the developer to change anything to use the Android OS and applications this way. Android already has some of the best multitasking support and is more multi-threading friendly than competing operating systems on the market. So this should be good news for the Asus Transformer Prime's soon-to-be users.
-The GPU (Graphics Processing Unit) in the Tegra 3 SoC has 12 shaders. But because Nvidia has not followed a unified-shader architecture in this ARM SoC, as they have been doing in their PC and Mac discrete graphics cards, 8 of those 12 shaders are reserved for pixel work and the remaining 4 for vertex work. Maybe Nvidia will use a unified-shader architecture in the next-generation Tegra SoCs, when ARM-based devices are ready for it. The PowerVR MP2 GPU in the iPad 2 has more raw power than the Tegra 3 GPU (actually, it's the one thing I personally like about the iPad 2: its GPU!), but the Tegra 3 GeForce (the commercial name Nvidia uses for its gaming graphics processors) should give solid 3D performance in games, especially the officially supported ones. Nvidia has a long history in 3D gaming and has been using its solid connections with game developers to bring higher-quality gaming to Android, like what we've seen with the Tegra 2 SoC's capabilities in the games listed in the TegraZone Android app. Add to that, games are not just GPU-bound: Tegra 3's quad cores and 1GB of system RAM (the iPad 2 has 512MB) will pump up gaming quality for sure, and the pixel density of 149ppi displays crisper images than the 132ppi of the iPad 2. Once the Asus Prime is released, it can officially be considered the highest-performing Android device in the world, especially for 3D gaming.
Well, I thought I'd have more to type, but I paused for a long time and could not think of anything to add. I only wanted to share a few things I know about the Tegra 3. I have a high interest in computer graphics/processors and have been following the Tegra project since 2008.
Some of the Asus Prime's soon-to-be owners don't know or care that much about the technical details of the CPU in the device, and I thought I'd share with them.
Thanks and good luck.
Thanks for the info. Very interesting
As I understand it, the use of the lower power 5th core has decreased battery consumption by over 60% when compared to the earlier 2 core design. I am not sure how they are measuring consumption and the task load.
I am most excited about the tablet because of the Tegra 3.
In smartphones I find the idea of putting in more than one core rather pointless.
It is not the best solution for a tablet or any other mobile device either; I would much rather have well-programmed software than overpowered hardware.
Yet the Tegra has a nice concept.
I think most of the time I won't use more than that 5th core; I mean, it is even powerful enough to play HD video.
I will primarily use apps that display text and images, like the browser, which is said to utilize 4 cores, though I am sure that's only because of the sloppy programming.
So if people finally come to their senses and start optimizing their apps, we will have one quite powerful core and 4 in backup for REAL needs. Seems like an investment in the future to me.
Sent from my Nexus One using XDA App
Straight from Wikipedia:
Tegra 3 (Kal-El) series
Processor: quad-core ARM Cortex-A9 MPCore, up to 1.4 GHz single-core mode and 1.3 GHz multi-core mode
12-Core Nvidia GPU with support for 3D stereo
Ultra low power GPU mode
40 nm process by TSMC
Video output up to 2560×1600
NEON vector instruction set
1080p MPEG-4 AVC/h.264 40 Mbps High-Profile, VC1-AP and DivX 5/6 video decode[18]
The Kal-El chip (CPU and GPU) is to be about 5 times faster than Tegra 2[19]
Estimated release date is now to be Q4 2011 for tablets and Q1 2012 for smartphones, after being set back from Nvidia's prior estimated release dates of Q2 2011,[20] then August 2011,[21] then October 2011[22]
The Tegra 3 is functionally a quad-core processor, but includes a fifth "companion" core. All cores are Cortex-A9s, but the companion core is manufactured with a special low power silicon process. This means it uses less power at low clock rates, but more at higher rates; hence it is limited to 500 MHz. There is also special logic to allow running state to be quickly transferred between the companion core and one of the normal cores. The goal is for a mobile phone or tablet to be able to power down all the normal cores and run on only the companion core, using comparatively little power, during standby mode or when otherwise using little CPU. According to Nvidia, this includes playing music or even video content.[23]
Tegra 3 officially released on November 9, 2011
Tegra 2's maximum RAM limit was 1GB. Tegra 3's could be 2GB.
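The companion-core handoff described in the excerpt above can be pictured as a simple governor. This is a toy model with made-up names and thresholds (`COMPANION_MAX_MHZ`, `choose_cores`); the real switching logic lives in the chip itself:

```python
COMPANION_MAX_MHZ = 500   # the low-power core is capped at 500 MHz
MAIN_CORE_MHZ = 1300      # multi-core mode clock of the four main A9s

def choose_cores(demand_mhz, n_main_cores=4):
    """Pick which cores to power on for a given demand (toy model).

    Either the companion core runs alone (standby, music, light tasks)
    or it shuts off and 1-4 main cores take over; the two sets are
    never active at the same time, mirroring the Tegra 3 design.
    """
    if demand_mhz <= COMPANION_MAX_MHZ:
        return {"companion": True, "main_cores": 0}
    needed = -(-demand_mhz // MAIN_CORE_MHZ)  # ceiling division
    return {"companion": False, "main_cores": min(n_main_cores, needed)}

assert choose_cores(200) == {"companion": True, "main_cores": 0}
assert choose_cores(2000) == {"companion": False, "main_cores": 2}
```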
xTRICKYxx said:
Straight from Wikipedia:
Tegra 2's maximum RAM limit was 1GB. Tegra 3's could be 2GB.
Click to expand...
Click to collapse
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3... so it's possible. However, the same leak/article also says its chip is clocked at 1.6 GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
jerrykur said:
As I understand it, the use of the lower power 5th core has decreased battery consumption by over 60% when compared to the earlier 2 core design. I am not sure how they are measuring consumption and the task load.
You can read the white papers on the Tegra 3 over on Nvidia's website. The chip has a built-in controller that activates either the 4 main cores or the 1 companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes but the same design, in order to maximize energy efficiency across the performance curve; each process is more efficient at a different point on the power curve. So the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff
RussianMenace said:
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3... so it's possible. However, the same leak/article also says its chip is clocked at 1.6 GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
*Correction: Tegra 3 supports DDR2 AND DDR3. The original Transformer had 1GB of DDR2 @ 667 MHz. The Prime has 1GB of LPDDR2 @ 1066 MHz, a considerable bump in speed. Also, Tegra 3 supports up to DDR3 @ 1500 MHz!
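Quick arithmetic on that bump (peak rates only; real-world throughput is lower):

```python
# Same 32-bit bus, so the gain is purely the clock: 1066 vs 667 MHz
old_mhz, new_mhz = 667, 1066
speedup = new_mhz / old_mhz
print(f"LPDDR2-1066 vs DDR2-667: {speedup:.2f}x the peak transfer rate")
```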
xTRICKYxx said:
I think the only compatible RAM would be DDR2. Clock speeds don't matter, as the Tegra 3 can be OC'd to 2 GHz no problem.
I'm sure it can; hopefully they increase the battery capacity to compensate for the increased power use. As for the memory, Nvidia's site on Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2... so I don't know.
RussianMenace said:
I'm sure it can; hopefully they increase the battery capacity to compensate for the increased power use. As for the memory, Nvidia's site on Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2... so I don't know.
The Prime's RAM speed is considerably faster than the TF101.
If it does have room to expand, could we expand or upgrade the RAM?
doeboy1984 said:
If it does have room to expand, could we expand or upgrade the RAM?
Judging by the pictures, it doesn't look like the RAM will be removable or upgradeable (the RAM is the Elpida chip right next to the processor).
xTRICKYxx said:
The Prime's RAM speed is considerably faster than the TF101.
I never said it wasn't.
What I said is that both Tegra 2 and now Tegra 3 have a single 32-bit-wide memory interface, compared to the two on the A5, Exynos, Qualcomm, and OMAP4 chips. That means it will theoretically have lower bandwidth, which may cause problems with upcoming games, especially considering that you now have to feed extra cores and a beefier GPU. Now, whether or not it will actually be an issue, we will have to see.
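Rough peak-bandwidth numbers behind that argument (theoretical maxima under my own simplifying assumptions; effective bandwidth is always lower):

```python
def peak_bandwidth_gbs(bus_bits, mtransfers_per_s):
    """Theoretical peak = bus width in bytes x transfer rate."""
    return bus_bits / 8 * mtransfers_per_s / 1000  # GB/s

# Tegra 3: one 32-bit channel; rivals pair two 32-bit channels
single = peak_bandwidth_gbs(32, 1500)  # DDR3-1500 on Tegra 3
dual = peak_bandwidth_gbs(64, 1500)    # same memory, twice the width
print(f"single channel: {single} GB/s, dual channel: {dual} GB/s")
```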
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even in pure CPU benches, the 1 GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. I'm just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR, a much smaller company with a lot less money.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even in pure CPU benches, the 1 GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. I'm just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR, a much smaller company with a lot less money.
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware being the source of the performance. What Apple really has going for them is uniformity of hardware and software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Probably way over-simplified, but it lets them do more with less.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve. Just when Android devices started becoming as fast as the iPad 1, the iPad 2 was released, and it remains one of the strongest SoCs out in the field.
Even in pure CPU benches, the 1 GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. I'm just disappointed that Nvidia, one of the kings of GPU makers, can't even compete with PowerVR, a much smaller company with a lot less money.
Very good point. Also, Apple has the apps and games that showcase and utilize all this extra power. Even my original iPad has apps/games that I haven't seen Android dual-core equivalents of. I love my iPad, but I also own an Atrix dual-core Tegra 2 phone. I know open-source Android will win out in the end.
I came across a good comment in the Lenovo specs link that a member here posted in this thread:
"Google and Nvidia need to seriously subsidize 3rd party app development to show ANY value and utility over iPad. Apple won't rest on its laurels as their GPU performance on the A5 is already ahead with games and apps to prove it."
What do you all think about this? Not trying to thread-jack, as I see it's relevant to this thread too. What apps/games does Android have up its sleeve to take advantage of this new Tegra 3? The majority of Android apps/games don't even take advantage of Tegra 2 and similar SoCs yet. Are we going to have all this extra power for a while without it ever really being used to its potential? Android needs some hardcore apps and games. The iPad has all the b.s. stuff too, BUT it also has very hardcore apps and games that use it close to its full potential. IMO, my jailbroken iPad 1 still trumps most of these Tegra 2 tablets out now, not because of hardware specs, but because of the quality of the apps and games I have. I've noticed Android is finally starting to get more hardcore games like ShadowGun, Gameloft games, etc. I can't overclock or customize my iPad as extensively as Android, but the software/apps/games I have are great. No, I don't want an iPad 2 or iPad 3; I want an Android tablet now because it has more potential. Just like with anything in life, potential doesn't mean sh$& if it's not utilized and made a reality.
I was a Windows Mobile person first. Then I experienced dual-booting with XDAndroid on my Tilt 2, and I loved it. Then I knew I wanted a real Android phone or tablet. The first Android tablet I owned, for only a day, was the Archos 7 IT. It was cool, but I returned it since it couldn't connect to my WMWifiRouter, which uses an ad-hoc network. So I researched and finally settled on taking a chance with the Apple iPad. I used to be an Apple hater to the max, lol. My iPad changed all of that. I still hate Apple's closed system, but I had to admit the iPad worked great for what I needed and wanted to do. This iPad, which I'm writing this post on now, still works flawlessly after almost 2 years, and its specs are nowhere near the iPad 2 or all these new dual-core tablets. I'm doing amazing stuff with only 256MB of RAM, SMH. I hated having to hook the iPad up to iTunes for everything like music and videos, so I jailbroke it and got iFiles, which is basically a very detailed root file explorer. I also have the USB and SD card adapter, so now I can put my content on my iPad myself without being chained to iTunes; iTunes is only good for software updates. I'm still on the 4.2.1 jailbroken firmware and never bothered or really wanted to upgrade to the new iOS 5.0.1. With all my jailbreak mods/tweaks, I've been doing most of the new stuff people are only now able to do. All Apple did was implement jailbreak tweaks into their OS, for the most part.
Sorry for the long rant. I'm just excited about getting the new Prime Tegra 3 tablet. I just hope the apps/games that really take advantage of this power start rolling out fast, and I don't just mean TegraZone stuff, lol. Android developers are going to have to really step their game up once these new quad cores come out; really, even now with dual cores. I'm a fan of technology in general, and competition only makes things better. Android is starting to overtake Apple in sales and similar categories; the only thing is Android hasn't gotten on par with Apple-quality apps yet. The iPad's tablet-only apps are very numerous; lots are b.s., but tons are very great too. I'm just hoping Android tablet-only apps will be at least the same quality or better. I'm not looking to get a new quad-core tablet to play Angry Birds or other kiddie-type games. I'm into productivity, media apps, and hardcore games like Rage HD, NOVA 2, Modern Combat 3, Order & Chaos, Infinity Blade, ShadowGun, etc., all of which I have (and more) on my almost 2-year-old iPad 1.
Asus, being the first manufacturer to come out with a quad-core tablet and a Super IPS+ display, might just be the last push needed to get things really rolling for Android, as far as high-quality software and a tablet-optimized OS go. Can't wait to see how this plays out.
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware being the source of the performance. What Apple really has going for them is uniformity of hardware and software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Probably way over-simplified, but it lets them do more with less.
Great point; basically just what I was saying in my long post, lol.
nook-color said:
You can read the white papers on the Tegra 3 over on Nvidia's website. The chip has a built-in controller that activates either the 4 main cores or the 1 companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes but the same design, in order to maximize energy efficiency across the performance curve; each process is more efficient at a different point on the power curve. So the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff
That is correct. Actually, the "5th" companion core is also a Cortex-A9 like the main quads; it's just built on a special low-power process.
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware being the source of the performance. What Apple really has going for them is uniformity of hardware and software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of software optimization. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Probably way over-simplified, but it lets them do more with less.
Again, I agree. It's like asking why Xbox 360 and PS3 consoles can still push high-quality graphics compared to a new high-end PC: uniformity of hardware plays a big role there.
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 Playstation 3 with performance and graphics very similar to my PC.
CyberPunk7t9 said:
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 Playstation 3 with performance and graphics very similar to my PC.
That's because these days, most PC games are console ports.
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-yr) lifespan of the Teg3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support makes it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games--unless of course you're talking about emus.
e.mote said:
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-yr) lifespan of the Teg3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support makes it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games--unless of course you're talking about emus.
Is that true? NTFS support? Are you sure? Can you link me to a spec for that? If so then I can transfer files from my SD to an external NTFS without using Windows! That would be great for trips when I need to dump digital pics.

What's next after quad-core?

So in 2011 we have Tegra 2, and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octo-core, or an improved version of quad-core CPUs?
Fasty12 said:
So in 2011 we have Tegra 2, and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octo-core, or an improved version of quad-core CPUs?
Well, as octo-core desktop CPUs haven't really caught on yet, I would guess just better quad cores, likely with more powerful GPUs.
Tegra 3 is already very powerful; presumably they will increase RAM, make the chips more battery efficient, or even raise clock speeds. The 12-core Tegra GPU is pretty amazing already, and anything better must be godly.
Sent from my HTC Desire using xda app-developers app
If you mean for the mobile platform: will we really need to go beyond quad-core? Having seen how smoothly the SGS III runs with one, what more speed and refinement will you need to do your work (yes, more can still be expected)? As Android uses the extra cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think curiosity, and wanting the most advanced/latest hardware, are the only real reasons to have such a high-end CPU in your mobile.
What I would like to see is more RAM installed and lower RAM usage by the system...
Sounds like the octo-mom debate lives on: battery vs. performance. But to answer your question, I think it would be hexa-core, which is 6. Let's wait and see what is to come.
Sent from my SGH-T989 using Tapatalk 2
s-X-s said:
If you mean for the mobile platform: will we really need to go beyond quad-core? Having seen how smoothly the SGS III runs with one, what more speed and refinement will you need to do your work (yes, more can still be expected)? As Android uses the extra cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think curiosity, and wanting the most advanced/latest hardware, are the only real reasons to have such a high-end CPU in your mobile.
What I would like to see is more RAM installed and lower RAM usage by the system...
I agree. Cores are at their peak right now. The amount of CPU power we have, especially in the higher-end phones, is enough to accomplish many, many things. RAM is somewhat of an issue, especially since multitasking is a huge part of Android. I really think 2GB of RAM should be a standard soon. Also, better GPUs won't hurt.
Sent from my HTC T328w using Tapatalk 2
If they decide to keep going on the core upgrades over the next two or so years, I see one of two possibilities happening:
1) Dual-processor phones utilizing either dual or quad cores.
or
2) Hexa-core chips, since the desktop market already has a few 6-core chips (though whether or not they would actually be practical in a phone's architecture, no clue).
Generally speaking, whatever they come out with next will need either a better battery material or lower-power processors.
I mean, I'm pretty amazed by what my brother's HTC One X is capable of with the quad core, and here I am still sporting a single-core G2. But yes, I would like to see more advancement in RAM usage; we've got a nice bit of power, but how about a standard 2GB of RAM for better multitasking?
I believe 2013 will be all about more efficient quad-cores.
May I ask what going from 1GB to 2GB will improve? Loading times?
Hello everyone, could you tell me what quad core is?
Quad core means that a processor has four processing units (cores).
Because there are more cores, a parallelizable workload can, in theory, run up to 4 times faster.
Read more about it: http://simple.wikipedia.org/wiki/Multi-core_processor
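The "up to 4 times" only holds for fully parallel work; Amdahl's law gives the realistic ceiling once any serial portion remains (a standard formula, shown here as a small sketch):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup when only a fraction of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A workload that is 90% parallel gains only ~3.1x from 4 cores
print(round(amdahl_speedup(0.9, 4), 2))
```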
Maybe i7 in mobile devices?
I'm sure it will stay at quad-core CPUs; anything more is just overkill. They may introduce hyperthreading. It's going to boil down to efficiency.
Sent from my SPH-D700 using xda premium
I'd say the future lies in more efficient use of processors. Right now, Android is still far from optimized on multi-core processor-equipped devices. Project Butter is the start of a great movement by Google to optimize the operating system. Hopefully it spreads out to other OEMs and becomes the main focus for Android development.
Improving and optimizing current processors is the way hardware companies should go.
In my opinion, processor development is out running battery development. Optimized processors could reduce power consumption while preserving excellent speed and usability.
Sent from my Transformer TF101 using Tapatalk 2
Building processors on more efficient ARM architectures is going to be the way forward from what I see. Throwing four less efficient cores at a problem is the caveman method of dealing with it (looking at you, Samsung Exynos Quad, based on tweaked A9 cores).
The Qualcomm S4's Krait (a custom core comparable to the A15) is more efficient on a clock-for-clock, core-for-core basis, and once the software catches up and starts using the hardware to full capacity, fewer, more efficient cores will be preferred.
I don't see anything beyond quads, simply because developers haven't even scratched the surface of what can be done with a modern dual-core processor yet. Throwing more cores at the problem only makes excuses for poor code. I can shoot **** faster than water with a big enough pump, but that doesn't mean it's the better solution.
We don't need more cores! Having more than 2 cores will not make a difference so quad cores are a waste of space in the CPU die.
Hyperthreading, duh.
More ram. Got to have the hardware before the software can be made to use it.
With the convergence of x86 into the Android core and the streamlining of low-power Atom CPUs, the logical step would be to first optimize the current software base for multi-core processors before marketing takes over with their stupid x2 multiplying game...
Not long ago, a senior Intel exec went on record saying that today, a single core CPU Android smartphone is perhaps better overall performing (battery life, user experience, etc) than any dual/quad-core CPU. Mind you, these guys seldom if ever stick out their neck with such bold statements, especially when not pleasing to the ear...
For those interested, you can follow this one (of many) articles on the subject: http://www.zdnet.com/blog/hardware/intel-android-not-ready-for-multi-core-cpus/20746
Android needs to mature, and I think it actually is. With 4.1 we see the focus drastically shifted to optimization, UX and performance with *existing/limited* resources. This will translate to devices beating all else in battery life, performance and graphics but since it was neglected in the first several iterations, it is likely we see 4.0 followed by 4.1 then maybe 4.2 before we hear/see the 5.0 which will showcase maturity and evolution of the experience.
Just my 2c. :fingers-crossed:

iPad 4 vs 5250 (Nexus 10 SoC) GLBenchmark full results. UPDATE: now with Anandtech!!

XXXUPDATEXXX
Anandtech has now published its performance preview of the Nexus 10, so let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result has appeared on GLBenchmark for the iPad 4, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so will be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, an important criterion for gaming.
Which device wins, click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is roughly twice as fast as its older brother; the Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS, though Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500 MHz vs 250 MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output etc. are not double the iPad 3's, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
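The clock-vs-FPS point above can be checked with quick arithmetic. A sketch, using the FPS numbers and GPU clocks as reported in this thread (treat them as reported figures, not verified specs):

```python
# Reported GLBenchmark 2.5 Egypt HD offscreen (1080p) FPS and GPU clock (MHz)
results = {
    "iPad 4": (48.6, 500),   # SGX 543MP4, reportedly 500 MHz
    "iPad 3": (25.9, 250),   # SGX 543MP4, reportedly 250 MHz
}

fps_ipad4, clk_ipad4 = results["iPad 4"]
fps_ipad3, clk_ipad3 = results["iPad 3"]

speedup = fps_ipad4 / fps_ipad3      # ~1.88x: close to, but short of, 2x
clock_ratio = clk_ipad4 / clk_ipad3  # exactly 2x

# If FPS scaled purely with GPU clock, speedup would equal clock_ratio.
# The shortfall hints at other factors: bandwidth, CPU, and drivers.
print(f"Egypt HD speedup {speedup:.2f}x vs clock ratio {clock_ratio:.0f}x")
```

So the in-game result tracks the clock doubling fairly closely, which makes the smaller gains in the low-level tests the more interesting anomaly.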
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports on the Nexus 10 are overwhelmingly positive.
So the Mali-T604 didn't manage 5 times the Mali-400's performance, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
hung2900 said:
So Mali 604T didn't match 5 times better than Mali 400, or maybe Samsung underclocked it.
Still very good but not the best.
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized as those expected for the N10. Samsung has a habit of optimizing drivers through later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than Tegra 2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher; they could have been higher still if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure that would not make such a big difference in the numbers.
If you check the SGS2's numbers now, there's another 50% improvement in performance since the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how big an effect driver optimization can have on performance. My point is that we have to wait for proper testing on the final release of the N10.
Also, look closely at the fill rate in the Arndale board test. It's much less than expected: ARM says a Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s, yet it's showing only about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz, so it shouldn't be so far off.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
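For anyone wanting to reproduce the fill-rate math, here's a back-of-the-envelope sketch. The 4-pixels-per-clock figure is my assumption (a 4-core Mali-T604 writing one pixel per core per clock); the 533MHz clock and the ~60% observation are from this thread:

```python
# Theoretical peak fill rate = pixels written per clock * GPU clock
pixels_per_clock = 4     # assumed: Mali-T604 MP4, 1 pixel per core per clock
clock_hz = 533e6         # Samsung's stated Exynos 5250 GPU clock

peak_gpix = pixels_per_clock * clock_hz / 1e9  # ~2.13 GPixels/s theoretical
observed_gpix = 0.60 * peak_gpix               # ~60% of peak, per the post

print(f"theoretical peak ~{peak_gpix:.2f} GPix/s, "
      f"observed ~{observed_gpix:.2f} GPix/s")
```

That lines up with Samsung's quoted ~2.1 GPixels/s figure, which is why the Arndale result looks like a driver or measurement issue rather than a hardware limit.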
hung2900 said:
So Mali 604T didn't match 5 times better than Mali 400, or maybe Samsung underclocked it.
Still very good but not the best.
________________
Edit: I forgot that Exynos 4210 with Mali400MP4 GPU had very bad GLbenchmark initially (even worse than PowerVR SGX540), but after updating firmware it's way better than other SoCs on Android handsets.
In areas where the Mali-400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance. By contrast, in these low-level tests the iPad 4 is not a concrete 2x the power of the iPad 3, yet it achieves twice the FPS in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with pinch of salt. It's a dev board, and I doubt it has optimized drivers for the SoC like it's expected for N10. Samsung has this habit of optimizing the drivers with further updates.
SGS2 makes for a good case study. When it was launched in MWC2011, it's numbers were really pathetic. It was even worse than Tegra2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you again check the numbers now for SGS2, it's again another 50% improvement in performance from the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show that how driver optimization can have a big affect on the performance. My point is that we have to wait for proper testing on final release of N10 device.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz. So, it shouldn't be getting so less.
According to Samsung, it more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
I agree with most of what you have said. On the GPixel figure, this is like ATI GPUs' teraflops figures always being much higher than Nvidia's: in theory, with code written to hit the device perfectly, you might see those high figures, but in reality the Nvidia cards with lower on-paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The resources needed to drive those high-MP displays mean lots of compromises in effects, polygon complexity etc. to ensure decent FPS, especially when you consider that driving Battlefield 3 at 2560 x 1600 with AA and high textures requires a PC that burns 400+ watts of power, not a 10-watt SoC.
Overall, considering the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both; the biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
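To put the resolution argument in numbers, here's a quick sketch comparing the raw pixel load per frame. The panel resolutions are public specs; the point is the relative shading workload, not an exact GPU cost model:

```python
# Pixels the GPU must shade per frame at each native resolution
panels = {
    "Nexus 10": (2560, 1600),
    "iPad 3/4": (2048, 1536),
    "1080p":    (1920, 1080),
}

megapixels = {name: w * h / 1e6 for name, (w, h) in panels.items()}

# The Nexus 10 pushes nearly twice the pixels of a 1080p frame,
# which is why the offscreen 1080p test flatters both tablets.
ratio = megapixels["Nexus 10"] / megapixels["1080p"]
for name, mp in megapixels.items():
    print(f"{name}: {mp:.2f} MP per frame")
```

Roughly 4.1 MP versus 2.1 MP for 1080p, so at native resolution these tablets are doing almost double the fragment work of the offscreen benchmark.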
What's the point of speculation? Just wait for the device to be released and run all the test you want to get confirmation on performance. Doesn't hurt to wait
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures to? A15 is quite a big step, SW updates may be necessary for proper bench.
Both the A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if new SoCs become powerful enough to max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0ghz quad-core variant in the semi near future... Ohhhh the possibilities. Samsung has one hell of a piece of silicon on their hand.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google if you want to use Chrome as the stock browser, then develop to fast and smooth and not an insult, stock AOSP browser would be so much faster.
True. Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would get much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimization isn't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware; in fact it scores lower than the Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about the fill rate. Need to check what you guys say about this.
Is it running some debugging code in the devices at this time?
Turbotab said:
Both A9 & A15 use the same instruction set architecture (ISA) so no they won't. Benchmarks may need to be modified, if the new SoC are too powerful and max out the old benches, but for GL Benchmark, that has not happened yet and there are already new updates in the pipeline.
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750 MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
I have to disagree, this is from ARM's info site.
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
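A small lookup table makes the point at a glance. This reflects what ARM's documentation says each Cortex-A core implements (I'm confident about the A8/A9/A15 entries quoted above; the A5/A7 entries are from memory, so double-check them):

```python
# Architecture implemented by each ARM Cortex-A core, per ARM's docs
isa = {
    "Cortex-A5":  "ARMv7-A",
    "Cortex-A7":  "ARMv7-A",
    "Cortex-A8":  "ARMv7-A",
    "Cortex-A9":  "ARMv7-A",
    "Cortex-A15": "ARMv7-A",
}

# Same ISA means existing ARMv7-A benchmark binaries run unmodified on A15;
# only the microarchitecture (pipeline, caches, OoO execution) differs.
assert isa["Cortex-A9"] == isa["Cortex-A15"]
print("A9 and A15 share the ARMv7-A ISA")
```

So the A9/A15 split isn't an ISA split at all; the big step is microarchitectural, which is exactly why benchmarks don't need rebuilding to run on A15 parts.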
Keion said:
Once we get rid of the underclock no tablet will be able to match. I'm sure the Mali t604 at 750 MHz would destroy everything.
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that that awesome resolution taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that Awesome resolution does tax the GPU a lot. Heck most lower end desktop GPUs would struggle
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Nah, I think we can beat that too.
Drivers + OC.

[Q] Exynos 4412 Prime vs Intel?

I know the ARM architecture is aimed at low-power mobile processing, but with the upcoming release of the new Hardkernel ODROID development boards, at a price tag of $89 for the ODROID-U2 with an Exynos 4412 Prime CPU and Ubuntu support, I'm almost tempted to try one out as a desktop replacement.
But I'm wondering, how fast would it really be compared to Intel? Or is it impossible to say?
mvmacd said:
I know that the ARM architecture is based on low-power mobile processing, but with upcoming release of the new Hardkernel ODROID development boards, at at a price tag of $89 for the ODROID-U2 and a Exynos 4412 Prime CPU, and supporting Ubuntu, I'm almost tempted to try it out as a desktop replacement.
But I'm wondering, how fast would it really be? Compared to Intel? Or is it impossible to say?
From what I've read around the Odroid forums, the performance of the Prime is supposed to be on par with first-gen i3 processors, which is pretty decent for a mobile processor. They are also working on an overclock app that will allow the CPU to scale to 2GHz per core. I plan on getting the ODROID-U2, and I already have the ODROID-X :laugh:
jwhisl said:
From what I've read around the Odroid forums, the performance of the Prime is supposed to be on par with first gen i3 processors, which is pretty descent for a mobile processor. They are also working on an overclock app that will allow the cpu to scale to 2Ghz per core. I plan on getting the odroid-u2 and I already have the odroid-x:laugh:
I am considering getting one as well for basic testing and development, but I also want to use one as a media player; it doesn't seem to support many formats due to licensing issues, though.
absolutely!
I have been eyeing on these mini-PCs/SBCs(Single board computer) for quite a long time.
Guess I will be getting this sweet little baby soon (ODROID-U2)
Apart from the 2GB of RAM (which is in effect the best at this price point across all the similar boards), it beats the Tegra 3 by quite a margin.

[Q] Will the gnex be a high end phone?

So will the GNex be a high-end phone, as in having desktop convergence? According to their website, for it to be a high-end phone it needs a quad-core A9 processor; otherwise it fulfills every other aspect. This phone has a dual-core A9 processor, so will there be desktop convergence? I really hope there will be; I want to try it out.
Well, from what I saw, every video demo etc. was specifically the Galaxy Nexus. At one point I did see "a" phone connected using the desktop feature, but each phone I saw running Ubuntu mobile has been the Nexus.
Edit: I've also heard the specs released aren't final, just a figure for certain phones.
Sent from my Galaxy Nexus using Tapatalk 2
vwade79 said:
so will the gnex be a high end phone as in having desktop convergence? according to their website, for it to be a high end phone, it needs to have a quadcore a9 processor. otherwise it fulfills every other aspect. this phone has a dualcore a9 processor, so will there be desktop convergence. I really hope there will be I want to try it out.
The entry specs are:
1Ghz Cortex A9
512MB – 1GB
4-8GB eMMC + SD
Multi-touch
The high-end specs are:
Quad-core A9 or Intel Atom
Min 1GB
Min 32GB eMMC + SD
Multi-touch
So I think the Galaxy Nexus is just a "normal" Ubuntu Phone device.
owain94 said:
the entry specs are
1Ghz Cortex A9
512MB – 1GB
4-8GB eMMC + SD
Multi-touch
hig end specs are
Quad-core A9 or Intel Atom
Min 1GB
Min 32GB eMMC + SD
Multi-touch
so i think The Galaxy Nexus is just a "normal" UbuntuPhone device
Those are OEM specs, which may not mean much for a downloadable/installable version. The values were presumably chosen to give the full desktop experience, including downloading a lot of desktop apps. Time will tell, but I suspect the desktop version will run perfectly on the GNex, or any other internal-memory-only phone; it's just a matter of keeping an eye on the space available when installing Ubuntu native apps.
I'm also wondering if the desktop feature will simply be disabled on devices with lower-than-recommended specs. I have seen some Ubuntu on Android videos using a Motorola Atrix 2, which has exactly the same hardware as the GNex. It ran the desktop and even Ubuntu TV pretty well, considering it was early in development (a year ago) and running on top of Android, with bridges to access the phone's settings and apps.
