Great work, Max!!! Resource use is an important angle to consider. The drive to increase the power of our system is not without significant consequences for our planet or the future of humanity.
Such a great article Max. It's such an interesting question. A couple of additional thoughts:
* To your point about energy efficiency, I wonder how much AI moves on device - all the big foundational models are expensive to train (both chip-wise and energy-wise), and much of the compute today to answer queries is done in data centres. But once models get smaller, and computers/phones continue to increase in their ability to process on device, I wonder how much moves on device (and as part of the day to day, which doesn't much impact your need to charge your device daily). Interesting research question that could also form part of an investment thesis. Could Apple just grab a huge chunk of this market if all the AI is running directly on iPhones?
* A lot of what Google has said about AI is that while it's driving an increase in their emissions due to more data centre energy use, there is a lot of potential for AI to help in solving climate challenges, helping create companies that are providing solutions, coming up with unique ideas, etc. A lot of doomers / degrowthers discard the possibility of immense productivity gains from AI that help solve the climate crisis, even though to get there, it might exacerbate it a little.
Thanks so much for the comment! Your observation about on-device AI processing is spot-on. As AI models become more efficient and our devices more powerful, we could indeed see a shift towards more on-device processing. The reality is likely more complex, with a mix of on-device and cloud-based AI coexisting, each optimized for different use cases.
As AI capabilities expand, models often become more complex. While some tasks might move to devices, cutting-edge AI may still require powerful data centers.
This situation reminds me of two relevant principles: (1) Jevons Paradox suggests that as technology becomes more efficient, we tend to use it more. So, more efficient on-device AI might lead to increased AI use overall, potentially offsetting energy savings; and (2) the laws of thermodynamics remind us there's no free lunch. In energy terms, more complex computations will require more energy, regardless of where they're performed.
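The Jevons-style rebound above is easy to see with a back-of-the-envelope calculation. All the numbers here are hypothetical, chosen only to show the mechanism, not measured values for any real model or device:

```python
# Illustrative sketch of a Jevons-style rebound effect.
# All figures are hypothetical assumptions, not real measurements.

energy_per_query_old = 1.0   # arbitrary energy units per AI query today
energy_per_query_new = 0.25  # assume a 4x more efficient on-device model

queries_old = 100            # assumed daily queries before the efficiency gain
queries_new = 600            # assume usage grows 6x once AI is cheap and everywhere

total_old = energy_per_query_old * queries_old
total_new = energy_per_query_new * queries_new

# Despite a 4x per-query efficiency gain, total energy use rises,
# because the usage growth (6x) outpaces the efficiency gain (4x).
print(total_old, total_new)
```

The point is simply that whether total energy falls depends on whether demand growth outpaces efficiency gains, which is exactly the open question for on-device AI.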
These factors could impact how companies like Apple position themselves in the AI market, as you astutely pointed out. The balance between on-device and cloud processing will likely be a key consideration for tech companies moving forward.
Hydrogen is a battery, not a fuel. So, of course, is natural gas, but it was charged eons ago. To use hydrogen you have to generate it, which requires power.
And power is not the only hidden cost. There are also water and heat. These will likely be bigger issues.
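The "hydrogen is a battery" point can be made concrete with round-trip efficiency. The figures below are rough ballpark assumptions for electrolysis and fuel cells, not measured values for any specific system:

```python
# Rough round-trip accounting for hydrogen used as energy storage.
# Both efficiency figures are ballpark assumptions, not measurements.

electrolysis_eff = 0.70  # assumed: grid power -> hydrogen via electrolysis
fuel_cell_eff = 0.55     # assumed: hydrogen -> electricity via fuel cell

round_trip = electrolysis_eff * fuel_cell_eff

# Under these assumptions, well under half of the input power
# comes back out, and the rest is lost (largely as heat).
print(round_trip)
```

So "charging" the hydrogen battery is lossy in both directions, which is why the power needed to generate it is the dominant cost.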