Feels like a feedback loop. It's exciting that we're all in on AI because we're all in on AI!
Most major companies have released new model versions in the past year, but if people were asked to blindly determine whether they were using the newer or older version, I suspect the results would be close to random. The difference between versions seems to be shrinking asymptotically, much like what happens in every other domain where neural networks are applied.
I also think it's clear that the difference between the various options is diminishing. Aside from certain manually designed idiosyncrasies (like ChatGPT's obsequiousness), people guessing which model they're using would also be mostly random. Somewhat surprisingly, even in the 'LLM arena' [1], where you can compare output side by side, the difference between models is approaching statistical zero!
[1] - https://huggingface.co/spaces/lmarena-ai/lmarena-leaderboard
In the US, we are terrified of nuclear, and the administration is trying to make less economical forms of energy production the norm because they are stuck in the past.
If there is an AI race, I have zero doubt that China WILL win, simply because of its energy production and the government's willingness to pour money in. It's a foregone conclusion at this point, in my opinion, because the time to build new energy sources was yesterday. The only ways I see China losing are a major debt crisis or getting into a war.
There is no chance, in my opinion, that the US federal government will suddenly get off its butt to fund AI or infrastructure, because it is too busy worrying about less than 1% of the population who don't affect them but whom they find 'icky'.
Woke isn't destroying the US; it's the people who are busy "judging their neighbors' porches" while the neighborhood burns down.
A race to what? What does the winner get? What does the loser get? What does 'winning' even fucking mean? My statistical next token predictor is better than your statistical next token predictor?
Groceries are going to get more expensive. Every dollar we spend on overbuilding AI infrastructure is a dollar we never get back.
We could use this money for other things, such as healthcare and fixing the broken parts of our education system. Instead, people are getting antsy to chat with a summarized version of all the garbage on the internet.
Another example: someone wants to build an ecosystem monitoring station for a nearby ravine, tracking pollution levels during rainfall and other events, plus air quality over time. That's just one small data point, but if people everywhere built their own ecosystem/weather monitoring stations using basic electronics ordered from the internet, and all plugged them into a standard observability software system, that could produce some pretty awesome outcomes, including figuring out the best way to clean polluted water (because some of those places will surely have implemented different methods of sanitizing their own water).
“Standard observability software,” whatever that is, also does not require AI to build. We need 10 GW to calculate rainfall for whom? What benefit over how we currently calculate rainfall? Is this rainfall hallucinated through summarization?
How? What are the mechanisms by which AI will make farms and greenhouses more productive? How will AI improve the automation that already exists in the farming sector, and has existed for a hundred years?
Electronics recycling: disassembling old computers to get the raw materials into a form that can be used again. We'll need programs to automate the production, testing, and analysis of the robots that will recycle the components.
That much is obvious. The fact that you’re straining so hard to come up with these bongcloud “ideas” should clue you in that maybe this isn’t the revolutionary tech that the suits are selling it as.
How are cost of living increases tied to AI investment?
Where do you think the tax money being given away to these huge datacenters is going?
What downstream effects does a major increase in demand for limited consumer services and goods have across the board?
As Hilaire Belloc put it in 1898: "Whatever happens, we have got / The Maxim gun, and they have not."
In that case it was the British who had just slaughtered a lot of Matabele in Africa.
> Losing the AI race is an existential threat to our civilization.
Lol. If winning it means a sea of hallucinated "factual" text and deepfake videos, then that death of truth is also a threat. Being rid of that is no threat at all.
there is no evidence of this
The USA already bet on software when it let China overtake it in manufacturing.
Now the only thing worse than betting on AI would be slowing it down inside the USA.