Tag Archives: electrical engineering

Battery Breakthrough

New battery could change world

Inside Ceramatec’s wonder battery is a chunk of solid sodium metal mated to a sulphur compound by an extraordinary, paper-thin ceramic membrane. The membrane conducts ions — electrically charged particles — back and forth to generate a current. The company calculates that the battery will cram 20 to 40 kilowatt hours of energy into a package about the size of a refrigerator, and operate below 90 degrees C.

This may not startle you, but it should. It’s amazing. The most energy-dense batteries available today are huge bottles of super-hot molten sodium, swirling around at 600 degrees or so. At that temperature the material is highly conductive of electricity but it’s both toxic and corrosive. You wouldn’t want your kids around one of these.

The essence of Ceramatec‘s breakthrough is that high energy density (a lot of juice) can be achieved safely at normal temperatures and with solid components, not hot liquid.

Ceramatec says its new generation of battery would deliver a continuous flow of 5 kilowatts of electricity over four hours, with 3,650 daily discharge/recharge cycles over 10 years. With the batteries expected to sell in the neighborhood of $2,000, that translates to less than 3 cents per kilowatt hour over the battery’s life. Conventional power from the grid typically costs in the neighborhood of 8 cents per kilowatt hour.
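To sanity-check those figures, here is a quick back-of-the-envelope calculation (a rough sketch in Python; the 5 kW output, 4-hour discharge, 3,650 cycles, and $2,000 price are the numbers quoted above, and it assumes the battery actually delivers a full discharge every day for 10 years):

# Rough check of the quoted lifetime cost figure.
power_kw = 5            # continuous output, kW
hours_per_cycle = 4     # discharge duration, hours
cycles = 3650           # one cycle per day for 10 years
price_usd = 2000        # expected selling price

energy_per_cycle_kwh = power_kw * hours_per_cycle    # 20 kWh per day
lifetime_kwh = energy_per_cycle_kwh * cycles         # 73,000 kWh over the battery's life
cost_cents_per_kwh = price_usd / lifetime_kwh * 100

print(f"Lifetime throughput: {lifetime_kwh:,} kWh")
print(f"Cost per kWh: {cost_cents_per_kwh:.1f} cents")   # about 2.7 cents

That works out to roughly 2.7 cents per kilowatt hour, consistent with the "less than 3 cents" figure, though it ignores the cost of generating the electricity that charges the battery in the first place.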

A small three-bedroom home in Provo might average, say, 18 kWh of electric consumption per day in the summer — that’s 1,000 watts for 18 hours. A much larger home, say five bedrooms in the Grandview area, might average 80 kWh, according to Provo Power. Either way, a supplement of 20 to 40 kWh per day is substantial. If you could produce that much power in a day — for example through solar cells on the roof — your power bills would plummet.

Ceramatec’s battery breakthrough now makes that possible.

Clyde Shepherd of Alpine is floored by the prospect. He recently installed the second of two windmills on his property that are each rated at 2.4 kilowatts continuous output. He’s searching for a battery system that can capture and store some of that for later use when it’s calm outside, but he hasn’t found a good solution.

“This changes the whole scope of things and would have a major impact on what we’re trying to do,” Shepherd said. “Something that would provide 20 kilowatts would put us near 100 percent of what we would need to be completely independent. It would save literally thousands of dollars a year.”

Very interesting stuff. If they can take it from the lab to production this could be a great thing. I would like one.

Related: Recharge Batteries in Seconds - Using Viruses to Build Batteries - Black and Decker Cordless Lawn Mower Review

Graphene: Engineered Carbon

A material for all seasons

Graphene, a form of the element carbon that is just a single atom thick, had been identified as a theoretical possibility as early as 1947.

Its unique electrical characteristics could make graphene the successor to silicon in a whole new generation of microchips, surmounting basic physical constraints limiting the further development of ever-smaller, ever-faster silicon chips.

But that’s only one of the material’s potential applications. Because of its single-atom thickness, pure graphene is transparent, and can be used to make transparent electrodes for light-based applications such as light-emitting diodes (LEDs) or improved solar cells.

Graphene could also substitute for copper to make the electrical connections between computer chips and other electronic devices, providing much lower resistance and thus generating less heat. And it also has potential uses in quantum-based electronic devices that could enable a new generation of computation and processing.

“The field is really in its infancy,” says Michael Strano, an MIT associate professor of chemical engineering who has been investigating the chemical properties of graphene. “I don’t think there’s any other material like this.”

The mobility of electrons in graphene — a measure of how easily electrons can flow within it — is by far the highest of any known material. So is its strength, which is, pound for pound, 200 times that of steel. Yet like its cousin diamond, it is a remarkably simple material, composed of nothing but carbon atoms arranged in a simple, regular pattern.

“It’s the most extreme material you can think of,” says MIT’s Tomás Palacios. “For many years, people thought it was an impossible material that couldn’t exist in nature, but people have been studying it from a theoretical point of view for more than 60 years.”

Related: Very Cool Wearable Computing Gadget from MIT - Nanotechnology Breakthroughs for Computer Chips - Cost Efficient Solar Dish by MIT Students - Superconducting Surprise

Google Server Hardware Design

Ben Jai, Google Server Platform Architect, discusses Google’s server hardware design. Google has designed its own servers from the beginning and this week shared details of that design. As we have written previously, Google has focused a great deal on improving power efficiency.

Google uncloaks once-secret server

Google’s big surprise: each server has its own 12-volt battery to supply power if there’s a problem with the main source of electricity. The company also revealed for the first time that since 2005, its data centers have been composed of standard shipping containers, each with 1,160 servers and a power consumption that can reach 250 kilowatts.

Efficiency is another financial factor. Large UPSs can reach 92 to 95 percent efficiency, meaning that a large amount of power is squandered. The server-mounted batteries do better, Jai said: “We were able to measure our actual usage to greater than 99.9 percent efficiency.”
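Some rough arithmetic shows what those efficiency numbers mean at the scale of a single container (a sketch using the 250 kW and 1,160-server figures quoted above; treating the efficiency as the fraction of input power that actually reaches the servers is my simplification):

# Power lost to backup/conversion for one 250 kW container,
# at the two efficiency levels quoted in the article.
container_kw = 250
servers = 1160

for label, efficiency in [("large UPS at 95%", 0.95),
                          ("server-mounted battery at 99.9%", 0.999)]:
    wasted_kw = container_kw * (1 - efficiency)
    per_server_w = wasted_kw * 1000 / servers
    print(f"{label}: ~{wasted_kw:.2f} kW lost (~{per_server_w:.1f} W per server)")

Over many containers running around the clock, the difference between roughly 12.5 kW and 0.25 kW of loss per container adds up quickly.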

Related: Data Center Energy Needs - Reduce Computer Waste - Cost of Powering Your PC - Curious Cat Science and Engineering Search

The Chip That Designs Itself

The chip that designs itself by Clive Davidson, 1998

Adrian Thompson, who works at the university’s Centre for Computational Neuroscience and Robotics, came up with the idea of self-designing circuits while thinking about building neural network chips. A graduate in microelectronics, he joined the centre four years ago to pursue a PhD in neural networks and robotics.

To get the experiment started, he created an initial population of 50 random circuit designs coded as binary strings. The genetic algorithm, running on a standard PC, downloaded each design to the Field Programmable Gate Array (FPGA) and tested it with the two tones generated by the PC’s sound card. At first there was almost no evidence of any ability to discriminate between the two tones, so the genetic algorithm simply selected circuits which did not appear to behave entirely randomly. The fittest circuit in the first generation was one that output a steady five-volt signal no matter which tone it heard.
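For readers unfamiliar with genetic algorithms, the sketch below shows the general shape of the loop Thompson used (a toy illustration only: the population size of 50 and the select-crossover-mutate cycle follow the description above, but the genome length is made up, and the hardware fitness test that downloaded each design to the FPGA and scored its response to the tones is replaced by a placeholder function):

import random

GENOME_BITS = 1800   # illustrative only; not Thompson's actual bitstream length
POPULATION = 50      # initial population size described above

def random_design():
    return [random.randint(0, 1) for _ in range(GENOME_BITS)]

def fitness(design):
    # Placeholder. In the real experiment this step configured the FPGA
    # with the design and measured how well its output discriminated
    # between the 1 kHz and 10 kHz test tones.
    return sum(design) / GENOME_BITS

def mutate(design, rate=0.002):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < rate) for bit in design]

def crossover(a, b):
    # Single-point crossover of two parent bitstrings.
    cut = random.randrange(GENOME_BITS)
    return a[:cut] + b[cut:]

population = [random_design() for _ in range(POPULATION)]
for generation in range(100):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:POPULATION // 2]            # keep the fitter half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POPULATION - len(parents))]
    population = parents + children

In Thompson’s experiment the same basic loop ran for thousands of generations, with the real FPGA standing in for the placeholder fitness function.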

By generation 220 there was some sign of improvement. The fittest circuit could produce an output that mimicked the input – waveforms that corresponded to the 1 kHz or 10 kHz tones – but not a steady zero or five-volt output.

By generation 650, some evolved circuits gave a steady output to one tone but not the other. It took almost another 1,000 generations to find circuits that could give approximately the right output and another 1,000 to get accurate results. However, there were still some glitches in the results and it took until generation 4,100 for these to disappear. The genetic algorithm was allowed to run for a further 1,000 generations but there were no further changes.

See Adrian Thompson’s home page (Department of Informatics, University of Sussex) for more on evolutionary electronics, such as Scrubbing away transients and Jiggling around the permanent: Long survival of FPGA systems through evolutionary self-repair:

Mission operation is never interrupted. The repair circuitry is sufficiently small that a pair could mutually repair each other. A minimal evolutionary algorithm is used during permanent fault self-repair. Reliability analysis of the studied case shows the system has a 0.99 probability of surviving 17 times the mean time to local permanent fault arrival. Such a system would have a 0.99 probability of surviving 100 years with one fault every 6 years.

Very cool.

Related: Evolutionary Design - Invention Machine - Evo-Devo

von Neumann Architecture and Bottleneck

We each use computers a great deal (to write and read this blog, for example) but often have little understanding of how a computer actually works. This post gives some details on the inner workings of your computer.
What Your Computer Does While You Wait

People refer to the bottleneck between CPU and memory as the von Neumann bottleneck. Now, the front side bus bandwidth, ~10GB/s, actually looks decent. At that rate, you could read all of 8GB of system memory in less than one second or read 100 bytes in 10ns. Sadly this throughput is a theoretical maximum (unlike most others in the diagram) and cannot be achieved due to delays in the main RAM circuitry.
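Those two figures are easy to verify (a quick sketch; the ~10 GB/s bandwidth and 8 GB of RAM are the numbers quoted above):

# Checking the quoted front side bus numbers.
bus_bytes_per_s = 10e9           # ~10 GB/s theoretical peak
ram_bytes = 8 * 2**30            # 8 GB of system memory

print(f"Time to read all of RAM: {ram_bytes / bus_bytes_per_s:.2f} s")   # ~0.86 s
print(f"Time to read 100 bytes:  {100 / bus_bytes_per_s * 1e9:.0f} ns")  # 10 ns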

Sadly the southbridge hosts some truly sluggish performers, for even main memory is blazing fast compared to hard drives. Keeping with the office analogy, waiting for a hard drive seek is like leaving the building to roam the earth for one year and three months. This is why so many workloads are dominated by disk I/O and why database performance can drive off a cliff once the in-memory buffers are exhausted. It is also why plentiful RAM (for buffering) and fast hard drives are so important for overall system performance.
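The "one year and three months" figure comes from rescaling real latencies to human time. Here is a sketch of that scaling; the 3 GHz clock and ~13 ms average seek time are my assumed round numbers, not figures from the article:

# Rescale a disk seek into "office time", where one CPU cycle = one second.
cpu_hz = 3e9             # assumed 3 GHz clock
seek_s = 13e-3           # assumed ~13 ms average disk seek

cycles_per_seek = seek_s * cpu_hz                 # ~39 million cycles
years = cycles_per_seek / (60 * 60 * 24 * 365)    # seconds -> years in the analogy
print(f"Cycles per seek: {cycles_per_seek:,.0f}")
print(f"Office-analogy wait: {years:.2f} years")  # ~1.2 years, about a year and three months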

Related: Free Harvard Online Course (MP3s) Understanding Computers and the Internet - How Computers Boot Up - The von Neumann Architecture of Computer Systems - Five Scientists Who Made the Modern World (including John von Neumann)