Low power

From Wikipedia, the free encyclopedia
Revision as of 21:27, 11 February 2008

In electronics, the term low-power means one of two things about a device:

Radio

J. H. Snider and Lawrence Lessig say that low power "smart" radio is inherently superior to standard broadcast radio.

"Technologists are increasingly discussing a related kind of gain called 'cooperation gain.' ... think about a party. If I need to tell you that it's time to leave, I could choose to shout that message across the room. Shouting, however, is rude. So instead, imagine I choose to whisper my message to the person standing next to me, and he whispered it to the next person, and she to the next person, and so on. This series of whispers could get my message across the room without forcing me to shout." -- "Wireless Spectrum: Defining the 'Commons'" by Lawrence Lessig 2003

"if nodes repeat each other's traffic. If I want to talk to someone across the room, I don't have to shout. I can just whisper it to someone near me, who can pass it on, and so on. ... as we add more transmitters, the total capacity goes up slightly, but we still have to face the fact that each transmitter's capacity goes down (just slower). Even better, we all end up using less energy (since we don't have to transmit as far), saving battery life." -- Open Spectrum: A Global Pervasive Network by Aaron Swartz

"Every time a broadcaster receives a license, the amount of available spectrum goes down. ... New technology, however, increases bandwidth with the number of users." -- "Why Open Spectrum Matters: The End of the Broadcast Nation" by David Weinberger

"If we lose ... open spectrum, we're also going to lose the open Internet" -- "The war against open spectrum" by Dana Blankenhorn 2007

Electronics

The density and speed of integrated circuit computing elements have increased roughly exponentially for a period of several decades, following a trend described by Moore's Law. While it is generally accepted that this exponential improvement trend will end, it is unclear exactly how dense and fast integrated circuits will be by the time that point is reached. Working devices have been demonstrated that were fabricated with a MOSFET transistor channel length of 6.3 nanometres using conventional semiconductor materials, and devices have been built that used carbon nanotubes as MOSFET gates, giving a channel length of approximately one nanometre.

The ultimate density and computing power of integrated circuits are limited primarily by power dissipation concerns.

An integrated circuit chip contains many capacitive loads, formed both intentionally (as is the case with gate to channel capacitance) and unintentionally (between any conductors that are near each other but not electrically connected). Changing the state of the circuit causes a change in the voltage across these parasitic capacitances, which involves a change in the amount of stored energy. As the capacitive loads are charged and discharged through resistive devices, an amount of energy comparable to that stored in the capacitor is dissipated as heat.
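The charge-and-discharge cost described above is usually estimated with the classic CV² relation. The following sketch (illustrative, hypothetical component values) computes per-cycle switching energy and average dynamic power using the common model P = α·C·V²·f, where α is the fraction of cycles on which a node actually switches.

```python
# Sketch of the classic dynamic-power model P = a * C * V^2 * f,
# where a is the switching activity factor. All values are hypothetical.

def dynamic_power(c_load, v_dd, freq, activity):
    """Average power spent charging/discharging a capacitive load."""
    return activity * c_load * v_dd**2 * freq

def energy_per_cycle(c_load, v_dd):
    """One full charge/discharge cycle through resistive paths dissipates
    about C*V^2: half on charging, half on discharging."""
    return c_load * v_dd**2

# Hypothetical node: 1 fF load, 1.0 V supply, 1 GHz clock, 10% activity.
p = dynamic_power(1e-15, 1.0, 1e9, 0.1)
print(f"{p:.1e} W")  # 1.0e-07 W for this one node
```

Multiplying that per-node figure by hundreds of millions of devices shows why switching power dominates the budget of a large chip.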

The result of heat dissipation on state change is to limit the amount of computation that may be performed on a given power budget. While device shrinkage can reduce some of the parasitic capacitances, the number of devices on an integrated circuit chip has increased more than enough to compensate for reduced capacitance in each individual device.

Some circuits -- dynamic logic -- require some minimum clock rate in order to function properly, wasting "dynamic power" even when they have nothing to do. Other circuits -- most famously the RCA 1802, but also many later chips such as the WDC 65C02, the 80C85, and the Freescale 68HC11, among other CMOS chips -- use "fully static logic", which has no minimum clock rate; the clock can be stopped and the circuit holds its state indefinitely. When the clock is stopped, such circuits use no "dynamic power", but they still have a small static power consumption caused by "leakage current".

As circuits shrink, subthreshold leakage current is becoming much more important. This leakage current results in power consumption even when no switching is taking place (static power consumption), and in modern chips this current is frequently more than 50% of the power used by the IC. This loss can be reduced by raising the threshold voltage and lowering the supply voltage. Both of these changes slow the circuit down significantly, so some modern low-power circuits use dual supply voltages to provide speed on critical parts of the circuit and lower power on non-critical paths. Some circuits even use different transistors (with different threshold voltages) in different parts of the circuit in an attempt to further reduce power consumption without significant performance loss.
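The leakage-versus-threshold trade-off can be illustrated with the usual first-order model, in which subthreshold leakage falls by one decade for every subthreshold-swing increment of added threshold voltage. The swing value below (100 mV/decade) is an assumed, typical figure, not taken from the text.

```python
# First-order subthreshold model: leakage current falls one decade for each
# `swing` volts added to the threshold voltage. The 0.1 V/decade swing is an
# assumed, representative value.

def leakage_reduction(delta_vth, swing=0.1):
    """Factor by which leakage falls when Vth rises by delta_vth volts."""
    return 10 ** (delta_vth / swing)

# Raising Vth by 100 mV cuts leakage roughly 10x under these assumptions --
# which is exactly why high-Vth transistors go on non-critical paths, where
# the accompanying loss of switching speed does not matter.
print(leakage_reduction(0.1))  # 10.0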

Another method used to reduce static power consumption is the use of sleep transistors to disable entire blocks when not in use. By shutting down a leaky functional block until it is used, leakage current can be reduced significantly. For some embedded systems that only function for short periods at a time, this can dramatically reduce power consumption. Since systems that are dormant for long periods of time and "wake up" to perform a periodic activity are often in isolated locations monitoring some sort of activity, they are generally battery or solar powered and power consumption is a key design factor.
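For the duty-cycled embedded systems described above, average power is simply a time-weighted mix of active and sleep power; the sketch below (all figures hypothetical) shows why power-gating a leaky block dominates the battery-life arithmetic.

```python
# Average power of a duty-cycled system: briefly active, power-gated otherwise.
# All numbers are hypothetical, for illustration only.

def average_power(p_active, p_sleep, duty_cycle):
    """Time-weighted average power for a system active a fraction of the time."""
    return duty_cycle * p_active + (1 - duty_cycle) * p_sleep

# Wake for 1 ms every second (0.1% duty cycle): 10 mW active, 1 uW gated sleep.
avg = average_power(10e-3, 1e-6, 0.001)
print(f"{avg * 1e6:.1f} uW")  # ~11.0 uW average
```

With sleep power gated down to microwatts, the active burst accounts for nearly all of the energy budget, so further savings must come from shortening or slowing the active period.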

Two other approaches exist to lowering the power cost of state changes. One is to reduce the operating voltage of the circuit, or to reduce the voltage change involved in a state change (making a state change only change node voltage by a fraction of the supply voltage — Low voltage differential signaling). This approach is limited by thermal noise within the circuit. There is a characteristic voltage proportional to the device temperature and to the Boltzmann constant, which the state switching voltage must exceed in order for the circuit to be resistant to noise. This is typically on the order of 50–100 mV, for devices rated to 100 degrees Celsius external temperature (about 4 kT, where T is the device's internal temperature in kelvins and k is the Boltzmann constant).
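The characteristic voltage mentioned above follows directly from the Boltzmann constant; a quick check uses the thermal voltage kT/q (thermal energy per electron charge) at an assumed device-interior temperature of 373 K. The multiple-of-kT comparison is this editor's arithmetic, not a figure from the text.

```python
# Thermal voltage kT/q at an assumed device-interior temperature of 373 K
# (100 degrees Celsius). Constants are the CODATA-defined values.

K_BOLTZMANN = 1.380649e-23    # J/K
Q_ELECTRON = 1.602176634e-19  # C

def thermal_voltage(temp_kelvin):
    """kT/q: thermal energy expressed as a voltage per electron charge."""
    return K_BOLTZMANN * temp_kelvin / Q_ELECTRON

vt = thermal_voltage(373.0)
print(f"kT/q = {vt * 1e3:.1f} mV, 4kT/q = {4 * vt * 1e3:.1f} mV")
# kT/q ~ 32 mV; 4kT/q ~ 129 mV -- the same order as the 50-100 mV
# noise floor quoted in the text.
```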

The second approach is to attempt to provide charge to the capacitive loads through paths that are not predominantly resistive. This is the principle behind adiabatic circuits. The charge is supplied either from a variable-voltage inductive power supply, or by other elements in a reversible logic circuit. In both cases, the charge transfer must be primarily regulated by the non-resistive load. As a practical rule of thumb, this means the rate of change of a signal must be much slower than that dictated by the RC time constant of the circuit being driven. In other words, the price of reduced power consumption per unit computation is reduced absolute speed of computation.
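The rule of thumb above can be made concrete with the standard adiabatic-charging estimate: ramping the supply over a time much longer than RC dissipates roughly (RC/T)·CV² in the resistance, versus about ½CV² for an abrupt step. A sketch with hypothetical component values:

```python
# Adiabatic charging estimate: a slow linear supply ramp over t_ramp dissipates
# about (RC / t_ramp) * C * V^2, versus ~0.5 * C * V^2 for an abrupt step.
# Component values below are hypothetical.

def step_loss(c, v):
    """Energy lost charging C to V through a resistor from a fixed supply
    (independent of R)."""
    return 0.5 * c * v**2

def adiabatic_loss(r, c, v, t_ramp):
    """First-order dissipation for a supply ramp much slower than RC."""
    return (r * c / t_ramp) * c * v**2

r, c, v = 1e3, 1e-15, 1.0             # 1 kOhm, 1 fF, 1 V -> RC = 1 ps
fast = step_loss(c, v)                # 0.5 fJ regardless of charging speed
slow = adiabatic_loss(r, c, v, 1e-9)  # ramp 1000x slower than RC
print(fast / slow)  # ~500x less energy dissipated by the slow ramp
```

The factor-of-500 saving here is bought with a 1000x slower transition, which is the speed-for-power trade the text describes.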

In practice, while adiabatic circuits have been built, it has proven very difficult to use them to substantially reduce computation power in practical circuits.

Lastly, there are several techniques used to reduce the number of state changes associated with any given computation. For clocked logic circuits, the technique of clock gating is used, to avoid changing the state of functional blocks that aren't required for a given operation. As a more extreme alternative, the asynchronous logic approach implements circuits in such a way that an explicit externally supplied clock is not required. While both of these techniques are used to varying extents in integrated circuit design, the limit to practical applicability of each appears to have been reached.

If current trends continue, "Energy costs, now about 10% of the average IT budget, could rise to 50% ... by 2010" ("Averting the IT Energy Crunch" by Rachael King).

Examples

Radio

Electronics