themcman1
Doesn't have much of a life
Joined: Thu Apr 23, 2009 6:54 pm Posts: 572
AMD's strategy
Sat Jun 13, 2009 5:54 pm
paulzolo
What's a life?
Joined: Thu Apr 23, 2009 6:27 pm Posts: 12251
Also Intel's. The single-core chips they put into some MacBooks and Minis are dual-cores with one core disabled. That core may be faulty, or it may be OK. I'm not sure if it's possible to reawaken the disabled core, but if you have a healthy two-core chip, then this is potentially good news.
Sat Jun 13, 2009 6:31 pm
big_D
What's a life?
Joined: Thu Apr 23, 2009 8:25 pm Posts: 10691 Location: Bramsche
I think the official clock speeds are artificially limited for a range of reasons, and the various reasons all come down to cost.
3GHz is fast enough for most general things that can be thrown at it by the average user these days.
Enthusiasts can spend money and overclock, but the average user doesn't need an overclock and doesn't bother.
Going faster increases heat output, so you need better cooling - a redesigned case, more air throughput, larger fans or liquid cooling. All of these make the machine less attractive for the average user: they either jump up the price or increase the ambient noise of the machine - and many machines are too noisy to start with!
Going faster also increases electrical usage, and as the heat increases, the increase in consumption isn't linear, AFAIK. Therefore, overclocking uses more power cycle for cycle than the "official" speed, because more is being lost to inefficiency.
This also goes for the green argument.
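As a rough back-of-the-envelope of why that consumption isn't linear: the usual first-order model for CPU dynamic power is P = C x V^2 x f, and pushing the clock usually means pushing the core voltage up too, so power climbs much faster than the clock does. A quick sketch - the capacitance and voltage figures here are made-up illustrative numbers, not real chip specs:

Code:
# First-order CMOS dynamic power model: P = C * V^2 * f
# (C = effective switched capacitance, V = core voltage, f = clock frequency).
# All figures below are illustrative assumptions, not measured chip specs.

def dynamic_power(c_farads, volts, hertz):
    """Approximate dynamic power in watts."""
    return c_farads * volts ** 2 * hertz

C = 3e-8                                  # assumed switched capacitance (F)
stock = dynamic_power(C, 1.20, 3.0e9)     # 3.0GHz at 1.20V
oc = dynamic_power(C, 1.35, 3.6e9)        # 3.6GHz, voltage bumped to 1.35V

print(f"stock: {stock:.0f}W, overclocked: {oc:.0f}W")
print(f"clock +{(3.6 / 3.0 - 1):.0%}, power +{(oc / stock - 1):.0%}")

In that model a 20% overclock costs roughly 50% more power, which is exactly the "cycle for cycle" loss.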
For our current software, we have processors which, in general, are more than fast enough. The problem is that programmers have become incredibly lazy over the years. They work in object-oriented languages, and things like .Net or the Core technologies from Apple make programming a doddle - but they also make the programs incredibly inefficient. A lot of programming skill in creating tight, reliable and fast code has been lost from mainstream applications over the last 20 years or so.
The more power we have available, the more power we need - not necessarily to make up for increases in the complexity of the code, but for the laziness in programming practices. There are some excellent coders out there, and a lot of very complex mathematical modelling is more efficient than a lot of general code, because it has to work harder and genuinely needs the power. It is often the programs which don't need mega resources, in theory, that use the most!
One unfortunate example was the Fennec team (the Windows Mobile version of Firefox). Early mobile web browsers would run in 1-2MB of RAM, yet the Fennec team railed against the Windows Mobile architecture because it wouldn't let them use more than 32MB for a single app...
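A toy example of the laziness point (my own, nothing to do with Fennec): both Python functions below build the same string, but the first copies the whole buffer on every pass, roughly O(n^2) work, while the second allocates the result once. The lazy version is the one that's easiest to write without ever noticing the cost:

Code:
# Two ways to build one big string from 100,000 pieces.

def lazy_build(pieces):
    # Each += can copy everything built so far: roughly O(n^2) work.
    # (Some interpreters optimise this case, but it isn't guaranteed.)
    out = ""
    for p in pieces:
        out += p
    return out

def tight_build(pieces):
    # join() sizes and allocates the result once: O(n) work.
    return "".join(pieces)

pieces = ["x" * 10 for _ in range(100_000)]
assert lazy_build(pieces) == tight_build(pieces)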
_________________ "Do you know what this is? Hmm? No, I can see you do not. You have that vacant look in your eyes, which says hold my head to your ear, you will hear the sea!" - Londo Molari
Executive Producer No Agenda Show 246
Sun Jun 14, 2009 9:49 am
pcernie
Legend
Joined: Sun Apr 26, 2009 12:30 pm Posts: 45931 Location: Belfast
Thanks for that. David Fearon's regular column in PC Pro this month is about multithreaded apps - the problems and how Intel are inching towards trying to solve them...
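For anyone who hasn't hit "the problems" first-hand, the classic one is a data race: two threads doing a read-modify-write on shared state at the same time, so updates get lost. A minimal sketch of the bug and the standard fix (plain Python threading, nothing specific to Fearon's column):

Code:
# The classic multithreading bug: an unsynchronised read-modify-write.
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1        # read, add, store: another thread can slip in
                            # between the steps, so increments get lost

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:          # the lock makes the whole update atomic
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)              # reliably 400000 with the lock; the unsafe
                            # version can come up short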
_________________
Plain English advice on everything money, purchase and service related:
http://www.moneysavingexpert.com/
Sun Jun 14, 2009 10:00 am
JohnSheridan
Doesn't have much of a life
Joined: Mon Apr 27, 2009 9:10 pm Posts: 1057
I understood it was all down to heat dissipation, and that until they can figure a way around that problem we won't see much higher clock speed CPUs.
I think I read an article from Intel saying they are going to concentrate more on multi-cores rather than pure clock speed, which would explain why we should see some 6-core CPUs out at the end of 2009 and then 8-core CPUs in 2010.
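Which also shows why the software side matters: extra cores only help if the work is actually split up. A minimal sketch of farming a CPU-bound job out across all available cores (standard Python multiprocessing; the workload is just a stand-in):

Code:
# More cores only help if the work is divided: a CPU-bound job
# spread over one worker process per core.
from multiprocessing import Pool, cpu_count

def crunch(n):
    # stand-in for real CPU-bound work
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    with Pool(cpu_count()) as pool:       # one worker per core
        results = pool.map(crunch, jobs)  # list chunks run in parallel
    print(sum(results))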
Sun Jun 14, 2009 10:03 am
pcernie
Legend
Joined: Sun Apr 26, 2009 12:30 pm Posts: 45931 Location: Belfast
You've maybe already read this: http://www.bit-tech.net/news/hardware/2 ... his-year/1
I'm still running an E6300, and probably will do until it or the board dies.
_________________
Plain English advice on everything money, purchase and service related:
http://www.moneysavingexpert.com/
Sun Jun 14, 2009 10:12 am
big_D
What's a life?
Joined: Thu Apr 23, 2009 8:25 pm Posts: 10691 Location: Bramsche
I agree - sort of. Heat dissipation itself isn't a big problem; overclockers deal with it all the time... What is a problem is economic heat dissipation.
The standard heatsink adds very little to the overall price of the CPU. But if the coolers were a couple of hundred quid each, and needed extra fans in the case, redesigned air flow and better fans, the next generation of faster machines would be much more expensive than the cheaper versions running at slower clock speeds...
Overclockers take the additional costs of cooling, decent cases etc. into the equation when building a new machine. The average business is looking for the most powerful machine to fit into its €300-400 budget; if the extra cooling is going to add an additional €300 to the price, it is a non-starter. The average home user will be the same.
Therefore, getting more done with the same heat output is the ultimate design goal at the moment, whilst probably also investing R&D into better, more economical ways of cooling processors.
_________________ "Do you know what this is? Hmm? No, I can see you do not. You have that vacant look in your eyes, which says hold my head to your ear, you will hear the sea!" - Londo Molari
Executive Producer No Agenda Show 246
Sun Jun 14, 2009 10:17 am