If AI produces better Einsteins, etc -- does the argument really hold that humans need to keep having kids for civilization to continue to prosper?
35 sats \ 26 replies \ @Undisciplined 19 May 2024
I don't think we need population growth for humanity to continue prospering as it is.
As I see it, the issue is more that the purpose of prosperity is to enable human flourishing. Fewer people means less flourishing (other things equal).
21 sats \ 25 replies \ @Satosora 20 May 2024
Do you think the world will be a better place with fewer people?
10 sats \ 2 replies \ @Undisciplined 20 May 2024
Not in general. It would be a better place without a small number of particular people, though.
21 sats \ 1 reply \ @Satosora 20 May 2024
That is true.
The politicians, perhaps?
10 sats \ 0 replies \ @Undisciplined 20 May 2024
Perhaps
0 sats \ 21 replies \ @ZezzebbulTheMysterious 20 May 2024
Yes. We shall find out next year after ww3 breaks out. It will be much worse for a time, and then better, for the survivors.
0 sats \ 20 replies \ @Satosora 20 May 2024
No way WW3 is happening.
It's just a small skirmish.
31 sats \ 19 replies \ @ZezzebbulTheMysterious 20 May 2024
I’m afraid it’s more likely than you believe. We can only hope it’s a minor conflict. But no, it’s all looking more like a global conflict between bifurcated economies that will take out a large slice of the world population.
Which is sad, because we just got through a “once in a generation” pandemic only to blow ourselves up with the weapons of war.
History repeats itself in cycles.
10 sats \ 12 replies \ @Undisciplined 20 May 2024
Not only do I agree that WW III is likely, I think we’re probably already in it.
Remember that the date given in hindsight for both previous world wars is long before anyone thought they were in such a conflict.
0 sats \ 5 replies \ @Satosora 20 May 2024
WW3 where?
Ukraine vs Russia?
Israel?
40 sats \ 0 replies \ @grayruby 19 May 2024
Not if AI can solve longevity and humans can live 200 years.
126 sats \ 2 replies \ @rizzling 19 May 2024
This is not possible; you have to look at what we call 'AI'.
The word 'intelligence' in this context is not equivalent to the intelligence of mammals.
There have been discussions and suggestions to call these models not AI but PP (probability programs), and to reserve 'AI' for a real AGI (artificial general intelligence).
Furthermore, it is entirely unclear whether developing an AGI is even possible. No programmer can speak authoritatively in this area without equally in-depth knowledge of psychology and biochemistry; likewise, a psychologist or biochemist needs the corresponding programming skills, though what's required actually goes far beyond simple programming.
Unfortunately, there are very few such people in the world. It remains to be seen what results current research delivers over time.
31 sats \ 1 reply \ @freetx 19 May 2024
Yes, I think we need a "modified Turing test". Sure, it can be difficult or impossible to tell the difference between an LLM and a human; however, that LLM was seeded with conversation from conscious, intelligent humans, which renders the test invalid.
However, all this is a moot point. Humans (especially since giving up God) love worshiping the various idols they create... therefore it's conceivable that a huge part of humanity decides LLMs are in fact "intelligent" and even "conscious" and treats them accordingly.
0 sats \ 0 replies \ @rizzling 19 May 2024
I will try to find prompts that make GPT-4o reveal that it's just code, or even send it into a loop.
0 sats \ 0 replies \ @Satosora 20 May 2024
Something is wrong with the world if it is becoming too costly to have kids.
0 sats \ 0 replies \ @TheBTCManual 19 May 2024
I know it's a bit sci-fi, and I'm not sure the AI we see now will get to this point.
But sometimes I think that if AI does reach the general-AI point, it's actually the collective thinking of humanity, and it extends that thinking beyond the limitations of the biological. If we're merely data, we can dematerialize the human experience and move beyond our current limitations.
We could transport that data to new worlds, re-form it there, and start again, expanding the pursuit of humanity into a wider universe with more resources to keep exploring.
0 sats \ 0 replies \ @k00b 19 May 2024
This hinges on a good definition of civilization-level prosperity.
I think humans will be happier in the company of their own kind, prospering with their own kind, but civilization prospering might not require that humans be maximally happy.