It’s really cool to receive fan mail…and this is the best one yet. It’s thrilling to hear that someone really really gets what you’re trying to say, and then is interested enough that the thoughts you were trying to convey take on a new life.
Alec Muller writes (note: contains spoilers):
At the end of Book 2, I think you hint that there will be trade between Earth and Mars, and that Mike wants to get in on the ground floor of it with an ‘Interplanetary Venice’ – a city that zips back and forth between the two planets to capitalize on trade. It’s more of a Venice than a Hong Kong in my mind because it needs to be militarily self-sufficient; if it’s not armed to the teeth, it will be too tempting a target for Earth bureaucrats. I’m assuming Earth will continue to be dominated by self-serving bureaucrats.
Earth will want technology (but only a tiny, heavily-sanitized cross-section of it) from martians, because Earth is already an economic backwater, and the contrast will be greater the more martian societies expand in a bureaucrat-free environment. Construction equipment (assuming the California earthquake recovery is still a huge drain on the economy), food-growing equipment, and mining equipment will be valuable to people (even bureaucrats) on Earth.
Paradoxically, Earth alone has nearly as much mass as Mercury, Venus, Mars, the Belt, and all of the moons of Jupiter & Saturn combined (but not the gas giants themselves, obviously), so expat mining technology is a potentially enormous benefit to people living on Earth. Surely the government will claim credit for the technologies and tout them as examples of the success of central planning while censoring their true origin. I imagine expat firms will open-source the dumb heavy stuff (steel plating and I-beams) while selling the smart light stuff (control electronics) that’s more economical to ship. Martian companies might even use dedicated cryptographically-controlled fabrication facilities on Mike’s Venice to make trade goods to sell to Earth.
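(Editor’s aside: the mass comparison actually holds up to within a few percent. A quick back-of-envelope check, using rounded NASA fact-sheet values rather than anything from the books:)

```python
# Rough masses in units of 10^24 kg (rounded NASA fact-sheet figures).
# The asteroid belt is the usual ~3e21 kg estimate; the moon totals are
# dominated by the four Galilean moons and Titan respectively.
masses = {
    "Mercury": 0.330,
    "Venus": 4.868,
    "Mars": 0.642,
    "asteroid belt": 0.003,
    "Jovian moons": 0.393,     # Io + Europa + Ganymede + Callisto
    "Saturnian moons": 0.141,  # mostly Titan
}
earth = 5.972
total = sum(masses.values())
print(f"combined: {total:.2f}e24 kg; Earth: {earth}e24 kg "
      f"({earth / total:.0%} of the combined mass)")
```

Earth comes out to roughly 94% of the combined mass of everything listed, so “nearly as much” is fair.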
What will people on Mars be willing to pay for in exchange? Not much, because a) they’re already highly self-sufficient and will become more so now that their population is higher, and b) shipping costs (especially the many-month delay) will literally be exorbitant. I think the biggest thing martians will want is immigrants, whom they’ll see as both an increased customer base and a source of genetic diversity (to make Mars stronger than it would be otherwise). This is true for humans, but doubly true for Dogs, who currently have a population of only 500 or so, which is a scary genetic bottleneck for any species. An influx of non-uplifted dogs and wolves would be a huge asset to them. There’s a humanitarian aspect to migration for both dogs & humans: many (although not all) martians will feel compelled to save their brethren from the slavery and oppression of Earth. Bureaucrats might also want to send their ‘undesirables’ to Mars just to be rid of them.
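(Editor’s aside: the “scary bottleneck” is quantifiable. A minimal sketch, assuming idealized Wright–Fisher dynamics and that all ~500 Dogs contribute to the effective population – both generous assumptions:)

```python
def inbreeding_after(generations, n_effective):
    """Expected inbreeding coefficient after g generations of random
    mating in an idealized population of effective size N.
    F accumulates at 1/(2N) per generation: F_g = 1 - (1 - 1/(2N))**g.
    """
    return 1 - (1 - 1 / (2 * n_effective)) ** generations

# With N = 500, inbreeding accrues ~0.1% per generation, so after 50
# generations (not long, given how fast Dogs mature) F is near 5%.
for g in (10, 50, 100):
    print(f"after {g:3d} generations: F = {inbreeding_after(g, 500):.3f}")
```

Fresh non-uplifted dog and wolf genomes would reset that clock, which is why the letter’s point about wanting canine immigrants is well taken.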
On a related note, I hope we’ll learn more about the Exodus, which caught me off guard. The population of Aristillus strained to grow from 100,000 to 150,000 through the first 90% of book 2 with the mass immigration of the open source ships, then suddenly grew more than 5x to ‘nearly 1 million people’ with the Exodus. That must have brought an extreme level of privation, or the Exodus migrants must have brought their own life support infrastructure, or humans, Dogs, and Gamma must have figured out a way to expand living space *very* quickly.
Gamma is the most fascinating character in the books for me. We don’t know a ton about his personality & motivations yet, but he seems highly conscientious while having a strong sense of self-preservation. He thinks very quickly, and is likely wealthier / more powerful than the Dogs or Darren Hollins (as evidenced by his ability to move the entire city). The most telling statement about him is that he’s ‘already reached the limits of his cognition’.
As fast and powerful as he is, he hasn’t been able to figure out a satisfactory workaround for splintered cognition. It looks like he’s struggling (and failing) to find a working set of principles for individual rights as they apply to AIs. He clearly cares about intelligent life, as he goes out of his way to save John, the Dogs, and the extraction-strike ship (when he *definitely* could have survived without them) in addition to the entire human and Dog population of the city (which he presumably thinks improves his chances for survival).
But on the other hand, he’s perfectly willing to use lethal force against splinters of his own mind who refuse to re-integrate. I think he sees it as the right to perform brain surgery on himself (“It’s *my* mind – if destroying a cancerous portion of it makes me stronger, then I have every right to do it.”), while the splinter versions see themselves as separate entities that have just as much a right to peacefully exist as he does. In some ways it’s analogous to the conflict between individualists & statists: the statists say, “SOCIETY is everyone who lives in this box we’ve drawn, WE are the duly-chosen people who decide what’s best for society, and we’ve decided you’re in our box so we have every right to force you to comply with our dictates,” while individualists say, “Take your box and leave us the fuck alone.”
Gamma could accept that splinters of him have a right to live independently of him, but it’s a risk because it effectively kicks off the race to a ‘hard take-off Singularity’ as you put it. If there’s only one of him (or one dominant with hundreds of lower-tier versions), he can afford to be cautious. The moment there are 2 or more with equal rights, though, they’re competing with each other, and there’s an incentive for each of them to take risks (for example, giving more permissions to lower-tier versions that might then escape, or making performance-enhancing tweaks to their cognition that could have unintended consequences) to compete more effectively against their rivals.
I think Gamma either a) is exercising extreme caution because he’s even more afraid of the consequences of the Singularity than humans & Dogs are, b) isn’t smart enough to come up with an ethics framework that will ensure peaceful coexistence between himself, other AIs, and other intelligent creatures like humans and Dogs, or c) is doing some combination of the two. It’s worth pointing out that Gamma could do everything he did in the books without being terribly smart – just *extremely* fast and conscientious. The fact that John was able to use Gamma’s precise use of language and (presumed) avoidance of lying to tease intel from him supports the hypothesis that Gamma has a lower IQ than Dogs or humans – or at least a less-robust “Theory of Mind”.
Speaking of ethics, I have some ideas on AI ethics that were influenced by Ramez Naam’s Nexus trilogy (which I highly recommend). Naam’s idea is that when multiple AIs exist (or could exist) in the world, none of them has any guarantee it will forever remain the most powerful one, so each has an incentive to respect the rights of lesser intelligences or risk retribution from one or more smarter, more powerful AIs. This in turn applies to those stronger AIs too (who have no reason to believe they’ll remain the strongest), and so on recursively up the chain.
My twist on that (which I can explain in more detail if you’re interested but won’t do here) is that a culture made of individuals spanning many different cognition-ability-levels (for example: smart AI, dumb AI, smart adult human, retarded adult human, uplifted Dog, non-uplifted dog, smart child human, etc) will optimally a) apply the same ethics rules for individuals within a level, but b) apply different rules to individuals on different levels, and c) apply more rules – particularly restrictions on interactions with lesser-ability individuals – to individuals on higher cognition levels. It’s like a book with 20 chapters, and non-uplifted dogs are expected to understand and abide by chapter 1, “Don’t bite people except in defense of your pack”, while a smart child and a retarded adult might need to understand up through chapter 4, a smart adult human through chapter 14, and a smart mature AI up through chapter 20 (which, by the way, would be completely incomprehensible to the smart human).
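(Editor’s aside: the key structural property of this scheme is that the rulebooks nest – a higher tier is bound by everything below it plus more. A toy sketch, with tier assignments taken from the letter and everything else invented for illustration:)

```python
# Toy model of the tiered-rulebook idea: each cognition tier must
# understand and obey all chapters up to its assigned level, so higher
# tiers carry a strict superset of the obligations of lower tiers.
CHAPTERS_REQUIRED = {
    "non-uplifted dog": 1,
    "smart child / impaired adult": 4,
    "smart adult human": 14,
    "mature AI": 20,
}

def rules_binding(tier):
    """Chapters an individual at `tier` is expected to abide by."""
    return list(range(1, CHAPTERS_REQUIRED[tier] + 1))
```

The nesting is what makes the scheme coherent: a mature AI is never exempt from “Don’t bite people except in defense of your pack”; it just has nineteen further chapters stacked on top.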
I’m curious to find out what Gamma will do with lesser-tier copies of himself on other worlds. Will he start developing Jupiter (the obvious destination for an AI, with all its exploitable natural resources) with heavily-constricted copies of himself, or is that too risky with the multi-hour light speed lag?
The Dogs are awesome. They’re held back by their small population and poor dexterity, but I get the impression they’re substantially smarter than the average human. I’d love to know more about their culture and structure (but it looks like they’re still figuring it out themselves). Their high intelligence and fast time-to-sexual maturity (10x faster than humans) put them in a position to *dominate* life in the Solar System – at least until AI eclipses them. Will the religious and non-religious Dogs maintain close ties, or go separate ways? How does anarchy (or heaven forbid, government) fit into the Dogs’ pack structure? What’s Dunbar’s Number for an uplifted Dog?
What kind of splintering will happen among humans when they get to Mars? In a sense, Mike is already preparing to splinter by planning to build his trade city/ship. I’d also expect Mark Soldner and friends to splinter off to form a government that charges taxes and restricts some set of liberties. While I appreciate Mike’s effort to stop Mark from imposing a government on all of them, I have a hard time seeing him resorting to violence to stamp out a government that restricts itself to a particular geographic area and only claims authority over people who explicitly agree to it. Will others who escaped government on Earth, but may not fully embrace anarchy, insist on setting up governments too? If 100,000 people could be effectively self-sufficient, can 1,000,000 people be effectively self-sufficient even after fragmenting into 10 separate settlements? Will Mike get his start trading between settlements before trading between planets? Will all the humans and Dogs stay on Mars when the Jovian moons beckon too?