How Super Algorithms Will Make Future iPhones & iPads Charge Twice As Fast | Cult of Mac

Plug in your iPhone or iPad and charge it up, and you’ll notice that while the first 80% or so goes by pretty fast, it actually kind of sucks at charging up that last 20%, taking a lot more time to do so than it feels like it should.

There’s a reason for this. Charging a battery up to “full” is a complicated process. There’s no direct way to tell when a battery is completely “full,” so all a charger can really do is measure the voltage, which (and this is a vast simplification) rises as the battery fills and gives a rough idea of how much charge it’s holding.

That’s why it takes so long for an iPhone to charge that last 20%. It charges full blast until it measures a certain voltage, then goes into what’s called “trickle mode” to slowly allow small sips of electricity into the battery until it thinks, based upon some software calculations, that the battery is more or less full. But a new algorithm could make the time it takes to charge your iPhone or iPad go by a lot faster.
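To make the two-phase behavior described above concrete, here’s a toy sketch of voltage-only charging logic. All names, thresholds and currents are made up for illustration; real chargers use hardware charge controllers, not Python:

```python
# Toy model of conventional voltage-only charging: full blast until a
# voltage threshold, then "trickle mode" sips until software says full.
# All numbers are illustrative, not real iPhone charging parameters.

FAST_CURRENT = 1.0       # amps: the "full blast" phase
TRICKLE_CURRENT = 0.1    # amps: the slow top-off phase
VOLTAGE_THRESHOLD = 4.1  # volts: switch to trickle once we measure this

def choose_charge_current(measured_voltage, software_says_full):
    """Pick a charging current the way a voltage-only charger does."""
    if software_says_full:
        return 0.0               # software calculations say we're done
    if measured_voltage < VOLTAGE_THRESHOLD:
        return FAST_CURRENT      # below the threshold: charge full blast
    return TRICKLE_CURRENT       # near the top: small sips only
```

Because the threshold trips well before the battery is truly full, the slow trickle phase covers that whole last stretch, which is why the final 20% drags.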

Researchers from the University of California, San Diego are saying that new algorithms they’ve developed will cut normal battery charging times in half for standard lithium-ion batteries, just like the ones your iPhone, iPad or MacBook use.

Here’s the deal. Instead of just relying on voltage measurement to tell how full a battery is, the new algorithms actually model exactly where lithium ions are within the battery cells to allow for a much more accurate battery reading.

How does that lead to faster charging? If a device has a better idea of how “full” a battery is by knowing where its lithium ions are, it doesn’t have to go into trickle mode quite so quickly. It can just blast juice into the battery until the point when it’s time to stop.
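The advantage can be sketched as a comparison. This is a hypothetical illustration of the idea, not the researchers’ actual algorithm: a charger that trusts a model-based state-of-charge estimate can keep fast-charging past the point where a voltage-only charger has already backed off. All functions and numbers here are invented for the example:

```python
# Contrast two charging policies. Voltage is only a rough proxy for state
# of charge, so a voltage-only charger backs off early; a charger with a
# trustworthy model-based state-of-charge (SoC) estimate can fast-charge
# closer to full. Thresholds are illustrative only.

FAST_CURRENT = 1.0     # amps
TRICKLE_CURRENT = 0.1  # amps

def current_voltage_only(measured_voltage):
    # Conventional: throttle as soon as the voltage looks high,
    # which happens around ~80% full.
    return FAST_CURRENT if measured_voltage < 4.1 else TRICKLE_CURRENT

def current_model_based(estimated_soc):
    # Model-based: with an accurate estimate of how full the cell
    # really is, keep blasting juice until it's genuinely time to stop.
    return FAST_CURRENT if estimated_soc < 0.95 else TRICKLE_CURRENT
```

At a point where the cell is, say, 85% full and already reading 4.12 V, the voltage-only charger has dropped to a trickle while the model-based one is still fast-charging, which is where the claimed time savings come from.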

The best part about this advance is that it’s not decades off, or the kind of thing that is so expensive and tricky to pull off that we’ll likely never see it in a consumer product. The researchers behind the algorithms are confident that “this technology is going into products that people will actually use.” Let’s hope that’s iPhones, iPads and MacBook Airs.

Source: UCSD
Via: Gizmodo

