Humanity's technology tries to match in power what was grown over eons by nature
Energy
For the past 200 years, human society has exploded with innovation, growth and activity in so many ways. The human population has grown from 983 million in 1800 to 8 billion in 2022.
We've solved the problem of distance with motorized transport such as cars, trains and airplanes. In 2023 I flew with my family to New Zealand for a vacation. It took us roughly 1 day by airplane (we rested a couple of days in Singapore, but pure airtime was around 25 hours). In 1826 this journey would have had to be made by sailing ship, which would take 4 to 6 months (120 to 180 days). Averaging that to about 150 days and comparing it to roughly one day of flying, the journey is now about 150 times faster. It no longer takes a significant part of a year, but at most a few days.
Furthermore, we've solved the problem of time with a communication network spanning the entire planet. Sending a letter 200 years ago took about as long as making the journey yourself. Today, I can instantly get someone from the other side of the world on the phone or even see them live in a video meeting.
Without getting into too much detail right away: it is reasonable to point to the enormous amount of energy afforded to us by fossil fuels as the driver behind much of this growth. We might take much of our fossil fuel energy for granted nowadays. The novelty of traveling at speed in a car has long since faded and been replaced by annoyance at traffic conditions. But as a species on this planet we've been incredibly lucky to have such a reservoir of explosive energy waiting for us to discover and use. Reading about the formation of oil on the website of the University of Calgary really puts this into perspective. The conditions for the formation of oil are incredibly fragile; we're lucky they occurred at all. And the whole process required the immense forces of our Earth's gravity and plate tectonics, working continuously over hundreds of millions of years.
The current energy transition efforts make sense. We will eventually run out of oil, coal and gas. If that happens while our civilization has no alternative, we will simply implode in the most horrible way imaginable. But I'm not a pessimist and certainly not a gloomy thinker preaching doom. We're well on our way to moving to renewables. But to really get your enthusiasm going, let's compare our energy transition project to the creation of fossil fuels.
In a few decades, our intelligence, cooperation and technological prowess are creating the power supply for a massive civilization, something the forces of nature and gravity took hundreds of millions of years to create. Think on that for a second. What took all of nature and our planet's massive gravity and plate tectonics hundreds of millions of years, we will do in 30, 40, maybe 50 years. Now, bragging isn't cool in my opinion. But in my eyes this is a legitimate reason to be excited about being part of Humanity, and to look at our achievements and ambitions in awe and with positivity. This just simply rocks.
Artificial intelligence
Artificial intelligence is another grand Human project that benefits from being put into a broader perspective. However, it differs from the energy project in that it is not a solved problem. Even though we've made great strides in generative AI, an LLM is still essentially a next-word predictor and not a true reasoning system.
Comparing our attempt at creating intelligence to what nature has done, a similar picture emerges in terms of time scales: our intelligence was shaped mainly by evolution, and it took nature hundreds of millions of years to arrive at human-level intelligence. Once again, our own efforts seek to reproduce this pinnacle in a couple of decades.
To build technological intelligence, we've taken nature's biological neuron and created a digital variant called the perceptron. And mirroring the brains nature creates, we create and train neural networks. To further our efforts, it helps to examine the types of training involved, along with their respective volumes of data and timescales. Generative AI based on large language models trains deep layers of perceptrons on the largest data set humanity has at its digital disposal: all text on the internet. I tend to compare this to the training a single human brain receives in one lifetime. It is reasonable to say we have condensed the time it takes to train a human being (20 to 40 years) into the few months it takes to train an LLM. That still leaves a huge area of training out of scope: the training our entire body-mind system received from hundreds of millions of years of evolution. There is a known class of algorithms, called genetic algorithms, that implement an evolutionary model for finding a solution to a given problem. Conceptually, I think there is room for improvement in AI by implementing a new form of genetic algorithm on top of LLM neural training.
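To make the perceptron idea a bit more concrete, here is a minimal sketch of a single perceptron learning a toy task. This is my own illustration, not anything from an actual LLM pipeline: the AND-gate data, the learning rate and the number of epochs are arbitrary choices, and a real network would stack many of these units in deep layers.

```python
import random

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs followed by a hard threshold activation,
    # the digital counterpart of a neuron "firing".
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if activation > 0 else 0

# Toy training data: the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [random.uniform(-1, 1) for _ in range(2)]
bias = 0.0
learning_rate = 0.1

# Classic perceptron learning rule: nudge weights and bias toward the target.
for _ in range(50):
    for inputs, target in data:
        error = target - perceptron(inputs, weights, bias)
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error

for inputs, target in data:
    print(inputs, "->", perceptron(inputs, weights, bias), "expected", target)
```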
What can we achieve in terms of intelligence when, on top of LLMs, we set out to condense the evolutionary training that took hundreds of millions of years into maybe 2 to 5 years? This could be the missing step towards true AGI that understands the world in actual cause-and-effect terms.
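As a purely conceptual sketch of what such an evolutionary layer could look like, here is a toy genetic algorithm. Everything in it is a placeholder of my own making: the "genome" is just a vector of made-up tunable parameters, and evaluate_model() is a stand-in for whatever fitness signal a real system would use; nothing here trains or calls an actual LLM. It only illustrates the selection, crossover and mutation loop that genetic algorithms are built on.

```python
import random

GENOME_LENGTH = 8      # number of hypothetical tunable parameters per model variant
POPULATION_SIZE = 20
GENERATIONS = 50
MUTATION_RATE = 0.1

def evaluate_model(genome):
    # Placeholder fitness: in a real system this would measure how well a
    # model variant performs on some task. Here we simply reward genomes
    # close to an arbitrary target vector, just to make the loop runnable.
    target = [0.5] * GENOME_LENGTH
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def crossover(parent_a, parent_b):
    # Single-point crossover: the child inherits a prefix from one parent
    # and the remainder from the other.
    point = random.randint(1, GENOME_LENGTH - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(genome):
    # Randomly perturb some genes with small Gaussian noise.
    return [g + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else g
            for g in genome]

# Start from a random population of model variants.
population = [[random.uniform(0, 1) for _ in range(GENOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):
    scored = sorted(population, key=evaluate_model, reverse=True)
    survivors = scored[:POPULATION_SIZE // 2]                 # selection
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))   # reproduction
                for _ in range(POPULATION_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=evaluate_model)
print("best fitness:", evaluate_model(best))
```

In a real attempt at compressing evolution, each genome would describe an entire model variant and the fitness evaluation would involve training and testing it, which is exactly where the enormous computational cost of such a project would show up.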