This article appears to have some factual basis, grounded in manufacturing economies of scale. However, it also anticipates an 8x increase in energy density or capacity (battery performance) over a period of 10 years, which is very ambitious and may not be realistic. While I cannot dispute the article's basic premise that there will be a significant decrease in the price of batteries, the size of the decrease it claims is startling.
Many decades ago, Gordon Moore (co-founder of Intel and Fairchild Semiconductor) came up with what is known as Moore's law.
Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years (https://en.wikipedia.org/wiki/Moore's_law). In other words, performance more or less doubles every two years, and reality tracked this for many years, though the pace has slowed in recent years. Battery chemistry does not follow Moore's law exactly: performance does not double every 2 years. An older (2007) MIT report expects battery performance to double over a decade rather than every 2 years (
https://www.technologyreview.com/s/407345/nanobatteries-and-moores-law/)
From that report: “Meanwhile, advances in energy capacity and calendar life are coming from improvements in electrode materials, sometimes using nanoscale particles. (See “3M’s Higher-Capacity Lithium-Ion Batteries,” “Powering GM’s Electric Vehicles,” and “Battery Breakthrough?”) These might lead to a doubling of energy capacity within a decade, which could go far toward improving electronic devices and cars. Battery performance could double in the next 10 years, according to one MIT scientist. (See “How Future Batteries Will Be Longer-Lasting and Safer.”) That’s no Moore’s Law, but, combined with more-efficient devices, it could make a big difference.”
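The gap between those two doubling periods is larger than it might sound once the growth is compounded. A quick sketch of the implied annual improvement rates (my own illustration, not from either article):

```python
# Implied annual compound improvement rate for "performance doubles
# every N years": rate = 2^(1/N) - 1.
def annual_rate(doubling_years: float) -> float:
    """Annual growth rate implied by a given doubling period."""
    return 2 ** (1 / doubling_years) - 1

print(f"Moore's law (2-yr doubling): {annual_rate(2):.1%} per year")   # → 41.4%
print(f"Battery (10-yr doubling):    {annual_rate(10):.1%} per year")  # → 7.2%
```

So a decade-long doubling corresponds to roughly 7% a year, against the roughly 41% a year that a Moore's-law pace would imply.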
One part of the Rocky Mountain study makes sense. If there is more demand, larger plants are set up, there is more investment in manufacturing technology, there are economies of scale, and this drives down cost. There is no expectation here that the base technology has changed, only that it has become cheaper to make. However, as the article points out, more interest in the product also means more interest in improving the base technology. In the electronics world, by increasing circuit density, manufacturers were able to substantially improve performance and miniaturize devices without a proportionate increase in costs. So the cost of ICs came down both due to manufacturing efficiencies (large volumes as ICs came into wider use) and due to the effect of higher packing densities.
This is reflected in the decline in the cost of lithium-ion batteries over time, as shown in the graph below. Most of it is due to Wright's law and volume-driven efficiencies; a small portion could be due to increases in battery density, estimated at about 3% a year (“The equivalent of Moore’s law for batteries is that they improve about 3% every year,” says Mike Toney, a researcher at Stanford Synchrotron Radiation Lightsource who works on battery technology —
https://www.forbes.com/sites/mikemo...eady-for-the-battery-revolution/#4370c6db19ca)
(https://ark-invest.com/research/wrights-law)
While a Moore’s Law style forecast deemed lithium-ion battery technology mature more than ten years ago, Wright’s Law correctly anticipated a reacceleration in cost declines and a resurgence in demand roughly five years ago. The decline in prices has opened up new segments of the auto market to lithium-ion batteries which, in turn, is pushing them toward an even larger market, utility-scale energy storage.
While the cost decline in batteries after the launch of Tesla’s Model S appears discontinuous when presented as a function of time, when recast using Wright’s Law – costs presented as a function of unit production – the cost drop appears neither discontinuous nor particularly surprising.
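Wright's Law says unit cost falls by a fixed fraction (the "learning rate") for every doubling of cumulative units produced, so cost is a function of cumulative production rather than of time. A minimal sketch of that relationship; the 18% learning rate and the $1,000/kWh starting cost below are illustrative assumptions, not figures from the article:

```python
import math

def wrights_law_cost(initial_cost: float,
                     cumulative_units: float,
                     learning_rate: float = 0.18) -> float:
    """Unit cost after producing `cumulative_units`, where cost drops by
    `learning_rate` with every doubling of cumulative production.
    The 18% default is an assumed, illustrative learning rate."""
    doublings = math.log2(cumulative_units)  # doublings since the first unit
    return initial_cost * (1 - learning_rate) ** doublings

# If the first unit cost $1,000/kWh, cost after 1,024x cumulative
# production (ten doublings):
print(round(wrights_law_cost(1000, 1024), 1))  # → 137.4
```

This is why a surge in unit volumes (such as after the Model S launch) shows up as a sudden kink on a cost-versus-time chart but as a smooth line on a cost-versus-cumulative-production chart.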
So I can see that this part of the Rocky Mountain Institute's hypothesis has a factual basis. The second part of the Rocky Mountain article, where the Moore's Law analogy comes in, is battery density, i.e. improving performance by having a battery hold more charge in the same volume.
Experts now believe that the energy density or capacity of batteries will improve at greater than the historical 3% a year (
https://www.forbes.com/sites/mikemo...eady-for-the-battery-revolution/#4370c6db19ca), due to the tremendous interest in improving battery performance through higher charge densities. However, the Rocky Mountain study is looking for an 800% improvement in 10 years, which implies an average improvement rate of about 23% a year for the next 10 years. It is not going to be linear; sometimes the growth rate in energy density will be faster, sometimes slower. But what is assumed is significantly higher than the past. I will be glad if an 8-fold improvement is achieved in 10 years, but it seems a rather daunting task. Yes, it happened in IC design, but will it happen in batteries? That is the 64-million-dollar question.
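The ~23%-a-year figure follows directly from compounding an 8x improvement over 10 years, as this check shows:

```python
# Compound annual growth rate implied by an 8x improvement in 10 years:
# rate = 8^(1/10) - 1.
rate = 8 ** (1 / 10) - 1
print(f"{rate:.1%} per year")  # → 23.1% per year

# Sanity check against the historical ~3%/yr pace over the same decade:
print(f"{(1.03 ** 10 - 1):.0%} total improvement at 3%/yr")  # → 34%
```

At the historical 3% a year, a decade yields only about a 1.34x improvement, which underlines how far above trend the 8x assumption sits.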