1. the_sleuth's Avatar
    With the arrival of dual-core processors in smartphones in 2011 (such as 1.2 GHz dual-core CPUs), there is a question that's been nagging at me for a while.

    If we follow Moore's Law (which says that roughly every two years we see a doubling of computing power), then quad-core smartphones should arrive by 2013, if not sooner.
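    Here's a rough back-of-the-envelope sketch of that projection (purely illustrative, assuming the whole transistor budget goes into cores and nothing else changes):

    ```python
    # Toy projection: if the transistor budget doubles every ~2 years and phone
    # makers spend it on cores, a 2011 dual-core implies a 2013 quad-core.
    # Purely illustrative; real chips spend transistors on cache, GPU, radios, etc.

    def projected_cores(base_year, base_cores, target_year, doubling_period=2):
        doublings = (target_year - base_year) / doubling_period
        return base_cores * 2 ** doublings

    for year in (2011, 2013, 2015):
        print(year, int(projected_cores(2011, 2, year)), "cores")
    # 2011 2 cores
    # 2013 4 cores
    # 2015 8 cores
    ```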

    But here's my question: do you really need all of that processing power in a smartphone?

    In PCs and notebooks, quad-core CPUs have been out for over two years in the consumer market. Yet most consumers can make do with dual-core or single-core notebooks. There is a trade-off between battery longevity and CPU performance.

    In fact, with the rising popularity of tablets, consumers are opting for less CPU power in favor of battery longevity (around 10 hours seems to be the sweet spot for satisfaction).

    In my opinion, battery technology would have to improve substantially for Moore's Law to continue in smartphones; otherwise, the trend breaks down.

    Although QNX would be able to handle a 16-core CPU, where is the benefit in a smartphone?

    In the long run, as hardware becomes less of a differentiating factor, RIM will no longer be playing catch-up to the competition. Consumers will focus more on the overall UI and app experience. RIM's focus on a gesture-based UI in QNX should help here, and RIM has time to develop an ecosystem around QNX with greater security than the competition.

    Any comments?
    08-01-11 04:05 PM
  2. the_sleuth's Avatar
    Progress Hits Snag: Tiny Chips Use Outsize Power

    By JOHN MARKOFF

    For decades, the power of computers has grown at a staggering rate as designers have managed to squeeze ever more and ever tinier transistors onto a silicon chip, doubling the number every two years, on average, and leading the way to increasingly powerful and inexpensive personal computers, laptops and smartphones.

    Now, however, researchers fear that this extraordinary acceleration is about to meet its limits. The problem is not that they cannot squeeze more transistors onto the chips (they surely can), but instead, like a city that cannot provide electricity for its entire streetlight system, that all those transistors could require too much power to run economically. They could overheat, too.

    The upshot could be that the gadget-crazy populace, accustomed to a retail drumbeat of breathtaking new products, may have to accept next-generation electronics that are only modestly better than their predecessors, rather than exponentially faster, cheaper and more wondrous.

    Simply put, the Next Big Thing may take longer to arrive.

    "It is true that simply taking old processor architectures and scaling them won't work anymore," said William J. Dally, chief scientist at Nvidia, a maker of graphics processors, and a professor of computer science at Stanford University. "Real innovation is required to make progress today."

    A paper presented in June at the International Symposium on Computer Architecture summed up the problem: even today, the most advanced microprocessor chips have so many transistors that it is impractical to supply power to all of them at the same time. So some of the transistors are left unpowered, or "dark" in industry parlance, while the others are working. The phenomenon is known as "dark silicon."

    As early as next year, these advanced chips will need 21 percent of their transistors to go dark at any one time, according to the researchers who wrote the paper. And in just three more chip generations, a little more than a half-decade, the constraints will become even more severe. While there will be vastly more transistors on each chip, as many as half of them will have to be turned off to avoid overheating.
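    To make the arithmetic behind those figures concrete, here is a toy model (my own sketch, not taken from the paper): assume a fixed chip power budget, a transistor count that doubles each generation, and per-transistor power that improves by only about 1.6x per generation, an illustrative figure chosen to land near the numbers above.

    ```python
    # Toy "dark silicon" model: fixed chip power budget, transistor count doubles
    # each generation, per-transistor power improves by only ~1.6x per generation
    # (an assumed, illustrative figure -- not taken from the paper itself).

    TRANSISTOR_GROWTH = 2.0   # Moore's Law: transistors double per generation
    POWER_SCALING = 1.6       # assumed per-transistor power reduction per generation

    relative_power = 1.0      # power needed to light the whole chip, in budget units
    for gen in range(1, 4):
        relative_power *= TRANSISTOR_GROWTH / POWER_SCALING
        lit_fraction = min(1.0, 1.0 / relative_power)
        print(f"generation {gen}: {1 - lit_fraction:.0%} of transistors dark")
    # generation 1: 20% dark
    # generation 2: 36% dark
    # generation 3: 49% dark
    ```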

    "I don't think the chip would literally melt and run off of your circuit board as a liquid, though that would be dramatic," Doug Burger, an author of the paper and a computer scientist at Microsoft Research, wrote in an e-mail. "But you'd start getting incorrect results and eventually components of the circuitry would fuse, rendering the chip inoperable."

    The problem has the potential to counteract an important principle in computing that has held true for decades: Moore's Law. It was Gordon Moore, a founder of Intel, who first predicted that the number of transistors that could be nestled comfortably and inexpensively on an integrated circuit chip would double roughly every two years, bringing exponential improvements in consumer electronics.

    If that rate of improvement lags, much of the innovation that people have come to take for granted will not happen, or will happen at a much slower pace. There will not be new PCs, new smartphones, new LCD TVs, new MP3 players or whatever might become the new gadget that creates an overnight multibillion-dollar industry and tens of thousands of jobs.

    In their paper, Dr. Burger and fellow researchers simulated the electricity used by more than 150 popular microprocessors and estimated that by 2024 computing speed would increase only 7.9 times, on average. By contrast, if there were no limits on the capabilities of the transistors, the maximum potential speedup would be nearly 47 times, the researchers said.
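    Translated into implied annual growth rates (my own arithmetic on the figures quoted above, assuming a roughly 2011 baseline), the gap looks like this:

    ```python
    # Implied compound annual speedup from the ~2011 -> 2024 estimates quoted above
    # (7.9x power-limited vs. roughly 47x if transistors were unconstrained).

    years = 2024 - 2011
    for label, speedup in (("power-limited", 7.9), ("unconstrained", 47.0)):
        annual = speedup ** (1 / years) - 1
        print(f"{label}: {speedup}x over {years} years = about {annual:.0%} per year")
    # power-limited: 7.9x over 13 years = about 17% per year
    # unconstrained: 47.0x over 13 years = about 34% per year
    ```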

    Some scientists disagree, if only because new ideas and designs have repeatedly come along to preserve the computer industry's rapid pace of improvement. Dr. Dally of Nvidia, for instance, is sanguine about the future of chip design.

    "The good news is that the old designs are really inefficient, leaving lots of room for innovation," he said.

    But other experts, not connected with Dr. Burger's research, acknowledged that he and the paper's other authors, from the University of Texas, the University of Washington and the University of Wisconsin, had put their finger on a real problem.

    Shekhar Y. Borkar, a fellow at Intel Labs, called Dr. Burger's analysis "right on the dot," but added: "His conclusions are a little different than what my conclusions would have been. The future is not as golden as it used to be, but it's not bleak either."

    Dr. Borkar cited a variety of new design ideas that he said would help ease the limits identified in the paper. Intel recently developed a way to vary the power consumed by different parts of a processor, making it possible to have both slower, lower-power transistors as well as faster-switching ones that consume more power.

    Increasingly, today's processor chips contain two or more cores, or central processing units, that make it possible to use multiple programs simultaneously. In the future, Intel computers will have different kinds of cores optimized for different kinds of problems, only some of which require high power.

    And while Intel announced in May that it had found a way to use 3-D design to crowd more transistors onto a single chip, that technology does not solve the energy problem described in the paper about dark silicon. The authors of the paper said they had tried to account for some of the promised innovation, and they argued that the question was how far innovators could go in overcoming the power limits.

    "Where things fall between the two is a matter of opinion," Dr. Burger said.

    Chip designers have been struggling with power limits for some time. A decade ago it was widely assumed that it would be straightforward to increase a chip's clock speed, or the rate at which it makes calculations. Then the industry hit a wall at around three gigahertz, when the chips got so hot that they began to melt. That set off a frantic scramble for new designs.

    Today, some of the pioneering designers believe there is still plenty of room for innovation. One of them, David A. Patterson, a computer scientist at the University of California, Berkeley, called dark silicon a real phenomenon but said he was skeptical of the authors' pessimistic conclusions.

    "It's one of those 'If we don't innovate, we're all going to die' papers," Dr. Patterson said in an e-mail. "I'm pretty sure it means we need to innovate, since we don't want to die!"

    http://www.nytimes.com/2011/08/01/sc...pagewanted=all
    08-01-11 04:13 PM
  3. Fubaz's Avatar
    Well, Nvidia already has a quad-core mobile gaming chip, so it won't be long, as long as the batteries are able to power them.

    Nvidia shows off Kal-El quad core mobile gaming muscle

    EDIT: First off, don't double post; edit your original. Also, there is no need to paste huge articles and then links at the bottom, just post the link.
    the_sleuth likes this.
    08-01-11 04:14 PM
  4. lnichols's Avatar
    Nvidia is already planning to release a quad-core in Kal-El. TI's OMAP 5 will have two Cortex-A15 cores at up to 2 GHz for performance and two Cortex-M4 cores for low-power normal use and real-time responsiveness (and it will be a pin-for-pin replacement for the OMAP 4 series). Is that much power needed in a smartphone? Probably not, but I'm sure there will be manufacturers putting them in stuff and spec whores who eat it all up.
    08-01-11 04:18 PM
  5. _StephenBB81's Avatar
    Moore's Law was not about doubling the speed; it was about doubling the transistor count.

    That could be done without seeing drastic speed increases.

    Posted from my CrackBerry at wapforums.crackberry.com
    08-01-11 04:26 PM
  6. the_sleuth's Avatar
    Kal-El and TI's OMAP 5 will be great for tablets, which have larger batteries than smartphones. But will these CPUs provide any benefit in a smartphone if battery life is shorter?

    08-01-11 04:38 PM
  7. SugarMouth's Avatar

    But here's my question: do you really need all of that processing power in a smartphone?
    Probably. We keep increasing the need for processing power as what we do with our phones gets more complex.
    08-01-11 05:53 PM
  8. Phil DeLong's Avatar
    Probably. We keep increasing the need for processing power as what we do with our phones gets more complex.
    And sometimes it's not even about need, but if we can get these devices to make our lives even easier, then why not?
    08-01-11 06:06 PM
  9. CGI's Avatar
    A phone that is super powered will replace a lot of other stuff.

    Think Asus PadFone and Atrix. A dockable device where one purchase serves multiple purposes.

    ^ That is in our future, folks!
    08-01-11 07:16 PM
  10. albee 1's Avatar
    Agree with sleuth; unfortunately, battery capacity is not doubling every two years.
    S'mores Law. Haha

    Posted from my CrackBerry at wapforums.crackberry.com
    08-01-11 07:30 PM
  11. Xterra2's Avatar
    The law says the number of transistors would be doubled, and NOT the speed or power.
    And by next month we would have the quad-core Tegra 3 (Kal-El); next year we would have dual-core 2 GHz, and later in the year 2.5 GHz quad-core.

    Posted from my CrackBerry at wapforums.crackberry.com
    08-01-11 11:28 PM