Oh, you mean some super intelligent 'pods' who will govern humans in the future and make them irrelevant.
I don't see this happening. Humans are super smart. They have accepted and let many things go, but they have never allowed, and will never allow, machines to govern them.
Unfortunately, human intelligence is steadily declining. It moves in cycles, upward and downward. See Edward Dutton's At Our Wits' End for the case for that.
To the main point: humans have now started to manipulate their genes. Thus, they will inevitably begin to select them. Since genes are massively complicated and contain an overwhelming amount of information, humans will begin to use computers to help them make these selections. Over many generations, a sort of genetic 'arms race' will ensue: those who use computers will outpace those who don't.
Then it becomes a matter of who has the best computers and algorithms, which will, as a result, continuously improve. Eventually, humans will depend on the computers to select their genes. The long arc of this dangerous trend is that the computers become very advanced and a kind of tango develops between 'humans' and computers. The arms race will lead to a situation where the computers that control humans best become the most powerful. On and on, until neither looks the same as it did before.
That's the danger humans face over the next many millennia. I'm not sure it can be avoided; the temptation to edit genes will not be easy to resist en masse.
Unless, of course, something destroys human progress first.
Agree on most parts, except for the 'tango between us and computers'.
Computers will always be servants to humans, because what separates humans from objects also applies to computers.
The idea of a computer or any machine taking over the world is predicated on the assumption that it would have the motive and desire to do so. However, machines are not sentient beings; they lack the capacity for self-awareness and emotion. This means that machines have no desire to take over the world, nor any other motive, for that matter.