"Lithium ion battery recharges in one minute"

Toshiba Corp. has developed a lithium-ion battery that the company says combines the short recharge time of capacitors with the energy capacity of conventional lithium-ion batteries. Toshiba's new battery can recharge to 80 percent of its energy capacity in one minute, approximately 60 times faster than typical Li-ion batteries.

APRIL FOOL!!!!!
 
Could be.

...if neowin.net, eetimes.com and beyond3d.com have faked dates. Or if eetimes.com started their April Fools' jokes a few days early.
 
Basic said:
...charging 5 milliwatts in 5 seconds.
5 mW here should probably be 5 mAh. (Maybe 5 mWh, but that's not a common unit for battery charging).

But if that unit were right, you shouldn't divide it by the time it was applied; it's 5 mW the whole time. So your 1 mW/s is a unit that doesn't make much sense (even though the unit is "correct" after calculating 5 mW / 5 s).

Then you treat that 1 mW/s as 1 mW when you calculate 300 µA @ 3.3 V (after the edit).

1 mW when charging, and then running for 10 min / 5 s = 120 times longer, means the player runs on less than 1 mW / 120 ≈ 8 µW.


You should also notice that even if we change the unit in the article to 5 mAh, that still implies that a full charge would take at least 10 min.
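As a sanity check of both deductions, here's a minimal Python sketch (the 5 mAh / 5 s reading, the ~1 mW charge power, the 10 min playback, and the 600 mAh cell mentioned below are the figures assumed in this thread):

```python
# Back-of-the-envelope check of the two claims above.
# Assumptions: the article meant 5 mAh in 5 s, the cell is 600 mAh,
# the charger delivers ~1 mW, and the player runs ~10 min per charge.

charge_per_burst_mAh = 5.0      # "5 mAh" reading of the article's figure
burst_time_s         = 5.0
capacity_mAh         = 600.0

# Full-charge time if 5 mAh goes in every 5 s:
full_charge_s = capacity_mAh / charge_per_burst_mAh * burst_time_s
print(f"full charge: {full_charge_s / 60:.0f} min")   # -> 10 min

# Energy bookkeeping for the player: 1 mW for 5 s, played back over 10 min.
charge_power_mW = 1.0
play_time_s     = 10 * 60
avg_draw_mW = charge_power_mW * burst_time_s / play_time_s
print(f"average draw: {avg_draw_mW * 1000:.1f} µW")   # -> ~8.3 µW
```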

Charging a 600 mAh cell to 80% of its capacity in 1 min means a current of at least (energy losses not included):
80% * 600 mAh / (1/60 h) = 28.8 A

That's a lot.
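A minimal sketch of that arithmetic in Python (the 600 mAh and 80%-in-one-minute figures come from the article; losses are ignored, as stated):

```python
# Minimum charge current for 80% of a 600 mAh cell in one minute,
# ignoring losses (figures taken from the article and the post above).

capacity_mAh  = 600.0
fraction      = 0.80
charge_time_h = 1.0 / 60.0   # one minute

charge_mAh = fraction * capacity_mAh            # 480 mAh to deliver
current_A  = charge_mAh / charge_time_h / 1000  # mAh / h -> mA -> A
print(f"{current_A:.1f} A")                     # -> 28.8 A
```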

Seen that ";)"?

I didn't really care, I simply took those 5 mW as power, nothing else. Those 300 µA are what the battery would deliver @ 3.3 V if it had 5 mW of power.

And mW/s surely does make sense, although it's another topic (P = dW/dt = U*I). mAh is not a unit for charge, but charge/time.

Your calculation of 28.8 A is correct, but that's not how charging works. For charging you'll always use a higher voltage than the battery's, in this case min. 5 V. Considering the power, that would be about 100 W needed, one quarter of what a usual PC needs. If you don't use the "constant-current" method and go with a higher (pulsed?) voltage like 12 V, then you'd need some 8.3 A (provided the battery can survive those 12 V, of course).
 
You keep mixing up the units for power and energy, and perhaps charge and current.

Just as you said:
P = dW/dt = U*I
or in words:
Power is the time-derivative of energy (energy = work), and is equal to the voltage across a circuit times the current through it.

But note that this commonly used variable name W (as in Work) isn't the same as the unit 1 W (as in Watt).

The unit for power (P) is 1 W = 1 Watt = 1 VA. (The unit 1 VA has another tricky use with alternating current, but it's OK here with DC.)
The unit for energy or work (E or W) is 1 J = 1 Joule = 1 Ws = 1 Watt-second ≈ 0.00000028 kWh.

So if you want something with the unit mW/s, it would for instance be dP/dt = d²W/dt² = d(U*I)/dt. While you're free to do that calculation if you want to, it's uninteresting in the context where you used the unit 1 mW/s.

300µA * 3.3V = 0.99 mW, not 5mW
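A two-line check makes the mismatch explicit (values taken from the posts above):

```python
# P = U * I: check what 300 µA at 3.3 V actually delivers.
power_mW = 3.3 * 300e-6 * 1000    # V * A -> W, then W -> mW
print(f"{power_mW:.2f} mW")       # -> 0.99 mW, not 5 mW

# The current that 5 mW at 3.3 V would actually correspond to:
current_mA = 5e-3 / 3.3 * 1000
print(f"{current_mA:.2f} mA")     # -> ~1.52 mA
```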


Current (I) is the time derivative of charge (Q): I = dQ/dt
The unit for current is 1 A.
The unit for charge is 1 Coulomb = 1 A * 1 s = 1000 mA * (1/3600) h ≈ 0.28 mAh.

One source of confusion is a common parameter of batteries. In battery-charging literature you often see references to 'c'. This c often has a rather sloppy definition that doesn't point out the difference between charge and current.

A battery with, say, a 2500 mAh capacity (= the charge you can get out of it) is said to have c = 2500 mA. Note the dropped 'h'. It's usually not pointed out that the capacity and c have different units: 2500 mAh is a charge, c is a current.
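To make the charge-vs-current distinction concrete, a small sketch (the 2500 mAh figure is just the example above):

```python
# Charge vs. current: mAh is a charge unit, A (or mA) is a current unit.

# 1 Coulomb = 1 A * 1 s, expressed in mAh:
coulomb_in_mAh = 1000 * (1 / 3600)   # 1000 mA for 1/3600 h
print(f"1 C = {coulomb_in_mAh:.2f} mAh")   # -> 0.28 mAh

# 'c' for a 2500 mAh battery: the current that would (dis)charge it in 1 h.
capacity_mAh = 2500.0
c_mA = capacity_mAh / 1.0            # capacity [mAh] / 1 h -> current [mA]
print(f"c = {c_mA:.0f} mA")          # same number, different unit
```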


And yes, when you charge a battery you always use a higher voltage than the battery's, otherwise you wouldn't get the current flowing. But the energy from this overvoltage is lost as heat in the internal resistance of the battery. With all the charging methods I've seen, you always need to put in a larger charge than what you can get out. On NiMH and NiCd it's typically 20% extra. I don't remember how much more it is for Li-ion, but it's definitely more in than out.

So to get a more realistic current than the optimistic 28.8 A above, you should factor in the energy lost to heat. With the NiMH/NiCd factor it would be 34.6 A. Let's reduce that a bit since I'm unsure of the Li-ion factor; say at least 30 A.

At 5 V, that's at least 150 W, and that's a whole lot more than a quarter of a usual PC (don't mix up max power ratings with actual usage). Having that kind of battery charger for your small HD MP3 player is rather extreme.
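The same estimate with losses included, as a sketch (the 20% surcharge is the NiMH/NiCd figure from above, applied here as an assumption since the Li-ion factor is unknown):

```python
# Charge current and charger power with charging losses included.
# Assumption: 20% extra charge needed (the NiMH/NiCd figure; the
# Li-ion factor is probably lower, hence "at least 30 A" above).

ideal_current_A  = 28.8
loss_factor      = 1.20
charge_voltage_V = 5.0   # the charging voltage assumed in the thread

real_current_A = ideal_current_A * loss_factor
power_W        = real_current_A * charge_voltage_V
print(f"{real_current_A:.1f} A, {power_W:.0f} W")   # -> 34.6 A, 173 W
```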

I've seen a few versions of pulsed battery chargers, and the pulsing is there to measure the idle voltage, and to let the battery "rest" a short while between longer periods of charging. The energy from the overvoltage still just generates heat, and doesn't result in any lower current or charge into the battery.
 
_xxx_ said:
For charging, you'll always use higher voltage than the battery's, in this case min. 5V.
GACK! Not for lithium ION, that's for sure.

A constant-voltage, current-limited source at 4.2 V is how you charge a lithium-ion battery.
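For reference, that scheme is usually called CC/CV: constant current until the cell reaches 4.2 V, then constant voltage while the current tapers off. Below is a toy simulation of that profile; the linear battery model and all component values are made up for illustration, not taken from any datasheet:

```python
# Toy CC/CV charge profile for a Li-ion cell: constant current until the
# terminal voltage hits 4.2 V, then constant voltage while the current
# tapers. The battery model (linear EMF + series resistance) and all
# values are illustrative only.

V_LIMIT  = 4.2    # V, the CV limit for Li-ion
I_CC     = 0.6    # A, constant-current phase (1c for a 600 mAh cell)
I_CUTOFF = 0.03   # A, terminate when the taper current falls below this
R_INT    = 0.15   # ohm, assumed internal resistance
CAP_AH   = 0.6    # Ah, cell capacity

def emf(soc):
    """Open-circuit voltage vs. state of charge (crude linear model)."""
    return 3.0 + 1.2 * soc

soc, t, dt = 0.0, 0.0, 1.0   # state of charge, time [s], step [s]
while soc < 1.0:
    i = I_CC
    if emf(soc) + i * R_INT > V_LIMIT:     # CV phase: hold 4.2 V
        i = (V_LIMIT - emf(soc)) / R_INT   # current tapers toward zero
        if i < I_CUTOFF:
            break                          # charge terminated
    soc += i * dt / (CAP_AH * 3600)        # coulomb counting
    t += dt

print(f"charged to {soc:.1%} in {t / 60:.0f} min")
```

The cutoff current (here 0.05c) is a common termination criterion; real chargers add temperature and timer safeguards on top.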
 
RussSchultz said:
_xxx_ said:
For charging, you'll always use higher voltage than the battery's, in this case min. 5V.
GACK! Not for lithium ION, that's for sure.

No idea, but thx for the info :)

Anyway, answering Basic's analysis: I already said your calculation was right. As for mixing up units, I last had to deal with that stuff some 7-8 years ago, so damn the units, the logic is essentially comprehensible...

What I know of charging "usual" batteries is higher voltage, but in the form of PWM. It saves some losses, but I don't think that's the only reason. I forget why, though :oops:
 