Ten Grand Technical Challenges of the 21st century: 1. Settle Space, Mars, moon, asteroids, the solar system.
The first of the Ten Grand Technical Challenges of the 21st century. The things that make us proud to be a part of the human race. The things that we should be
doing regardless of what else is true, the things that make life have meaning.
out into space for the last 50 years, it's obvious we have some innate need to expand into new
environments. People have been in space almost continuously since 1971 (about 50 years), and without interruption since 2000 aboard the ISS. I detailed the history of the technology in this post about how technology changes things.
those closer to home so far) but they are certainly robots. Eventually we'll be sending out ones that can reproduce. And if we will, so would any aliens. It doesn't take long before these robots infect the galaxy. But mostly what they do is just watch. We've been transmitting TV signals for decades, but that
means we've only announced ourselves to a bubble roughly 100 light years in radius, a few million cubic light years, whereas the volume of the galaxy is on the order of 10^13 cubic light years. How much of the resources of a galactic civilization would be focused on Earth? Not very much. The right comparison: if one atom in your body suddenly started sending out electrical signals, would you notice? I wouldn't.
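Those volumes are easy to sanity-check. A back-of-the-envelope sketch, assuming roughly 100 years of broadcasting and a galactic disk about 100,000 light years across and 1,000 light years thick (both assumed figures):

```python
import math

# Back-of-the-envelope check of the "one atom in your body" comparison.
# Assumptions: ~100 years of TV broadcasting, and a galactic disk
# ~100,000 ly in diameter and ~1,000 ly thick.
broadcast_radius_ly = 100
bubble_volume = (4 / 3) * math.pi * broadcast_radius_ly ** 3

galaxy_radius_ly = 50_000
galaxy_thickness_ly = 1_000
galaxy_volume = math.pi * galaxy_radius_ly ** 2 * galaxy_thickness_ly

print(f"broadcast bubble: {bubble_volume:.2e} cubic ly")        # ~4.2e6
print(f"galactic disk:    {galaxy_volume:.2e} cubic ly")        # ~7.9e12
print(f"fraction filled:  {bubble_volume / galaxy_volume:.1e}") # ~5e-7
```

So our announcement fills well under a millionth of the galaxy's volume, which is the point of the one-atom analogy.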
noticed us. It's hard to miss all the hydrogen bombs we've set off since 1952. But still, even if they did notice, it will take a long, long time for them to get here. If I could only live that long, it would be amazing! The only counter-theory is that the robots watching us could grow aliens from whatever DNA equivalent they use and operate independently. If the speed of light is really the limit on
transmitting information, living organisms as large as the galaxy are going to act very slowly. For a signal to propagate halfway across the galaxy takes about 50,000 years. Since our recorded history is less than 5,000 years, we have no practical experience of what a society like this would be like. But enough of the wishful thinking. What are we going to do before the killer robots get here?
measured in time. Time is money, but time is what people's lives are made up of. A human lifetime compared to the time scale of a galactic society is about 1 to 1,000. It seems like they would barely notice each other. So for now, we will ignore any effects the alien killer robots would have (although in my opinion, the alien killer robots are already here.)
habitable areas they are going to evolve independently of each other. So we need to learn how to
guide their evolution in the right directions.
efficient way to do this? I'm a big fan of going everywhere as fast as possible. So, I'm a big fan of the space station. We need to keep one manned forever. After the space station (the most expensive structure ever built, in orbit for over two decades), we need a base on another planet or moon. Where should it be?
a colony, and the easiest place to do that is on the moon. It's nearby, easy to deliver food and shelter
to, easy to rescue, cheaper and faster to get to than Mars. It will give us the experience we need to
learn how to inhabit other worlds. It's definitely going to be cheaper than going all the way to Mars.
Except, water is harder to come by on the moon. We'll want to be at the south pole where some water
has been found in crater shadows. We can learn how to make rocket fuel from the existing landscape,
which is critical to making a Mars colony work. If it takes us a while to make this happen, it won't matter on the moon, since we can fly rocket fuel to the moon, whereas getting it to Mars is extremely expensive. The moon gives us the chance to learn how to settle humans on the first body outside of the Earth, and it helps us get to Mars with more confidence. We should do both, but we can start going to the moon right now.
We don't have to build a new rocket; we can use the ones we have. This is a huge advantage. We've already settled space (the International Space Station), and now we need to settle the moon and learn how to settle Mars and the asteroids, of course.
Centauri, a mere 1.3 parsecs, or 4.22 light years, away."
some billionaire is helping to finance it: Starshot, and a list of the technology they need to develop: mostly lightweight chip packaging (they're essentially sending a cell phone into space) and some
software and materials work on the lightsail and the laser to control the actual sailing. They've already
produced a system design of how to get to Alpha Centauri, published in the Journal of the British Interplanetary Society.
space ships? Baby farms? Teleportation? All the current ideas sound ridiculous. But just wait, it's only
a matter of time and another blog post.
revolutionize economics. A way to fairly trade between realms. Realms can publish 'magic', or technology (the steps to produce something), and that something can allow a conscious entity to perform a particular skill. A skill allows the user to cast a 'spell': a set of defined steps (an algorithm) to produce a given output. So: time in, output out. What are examples of outputs? AIs. Cars. Rockets. Planes. Spaceships. Energy. Nuclear power. Would we trade the hydrogen bomb around? If you don't know
how to do that, I don't see how you'd be able to listen to the galactic internet. Another blog post is due
on how you make it profitable to run an intergalactic internet, and what you would use that network for, given that the 'power' of the network is limited by the huge latencies in the galaxy: 50,000 years, or about 1.6x10^12 seconds (1.6 trillion seconds).
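The realms-publish-magic model a few paragraphs back can be sketched as a tiny data model. Everything here (class names, fields, the fusion example) is my own hypothetical illustration, not anything specified above:

```python
from dataclasses import dataclass, field

# A minimal sketch of the "realms publish magic" trading model:
# a technology is published as a spell, and casting the spell
# turns time into an output. All names are hypothetical.

@dataclass
class Spell:
    name: str
    steps: list[str]      # the algorithm: defined steps to an output
    output: str           # e.g. "rocket", "energy", "AI"
    time_cost_s: float    # time in, output out

@dataclass
class Realm:
    name: str
    published: dict[str, Spell] = field(default_factory=dict)

    def publish(self, spell: Spell) -> None:
        """Make a technology available on the galactic network."""
        self.published[spell.name] = spell

    def cast(self, spell_name: str) -> str:
        """Spend the time, follow the steps, get the output."""
        return self.published[spell_name].output

earth = Realm("Earth")
earth.publish(Spell("fusion", ["confine plasma", "heat", "extract power"],
                    output="energy", time_cost_s=3.0e9))
print(earth.cast("fusion"))  # -> energy
```

The point of the sketch is only that a 'technology' is just data, which is what makes trading it over a network conceivable at all.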
network is defined as the amount of information it can hold at one time. The throughput is how much comes in or out of this repository of information; the power is how much is stored in the network, where by 'stored' we mean information in transit: sent but not yet received. Assume a data rate as high as we can conceive, given physics. What are the limiting factors? There are distortions in time, phase, and frequency that limit the data rate you can transmit per hertz. And there's the fundamental limit that Claude Shannon determined.
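Shannon's limit, and the 'power' (information in transit) defined above, can be put into rough numbers. A sketch, where the bandwidth and signal-to-noise figures are purely illustrative assumptions:

```python
import math

# Shannon capacity: C = B * log2(1 + S/N).
# Bandwidth and SNR below are illustrative assumptions.
bandwidth_hz = 1e12          # a 1 THz channel
snr = 100.0                  # signal-to-noise ratio (linear, not dB)
capacity_bps = bandwidth_hz * math.log2(1 + snr)

# The "power" of the network as defined above: information in transit,
# i.e. the bandwidth-delay product for a half-galaxy hop.
latency_s = 50_000 * 3.156e7           # 50,000 years in seconds
in_transit_bits = capacity_bps * latency_s

print(f"capacity: {capacity_bps:.2e} bits/s")
print(f"in transit on one channel: {in_transit_bits:.2e} bits")
```

Even one such channel would hold on the order of 10^25 bits in flight at any moment, which is what makes the latency, not the bandwidth, the defining property of a galactic network.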
time a customer spends in service and t as the average time between customer arrivals. Often we use
the following rate notation for these quantities: x = 1/µ (where µ is the service rate) and t = 1/λ (where
λ is the arrival rate). Further, we combine these two quantities and define ρ = x/t = λ/µ as the system efficiency (also referred to as the utilization factor). There is also latency due to the speed of light, but assume such latency is included in the service time.
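With the notation above, and under the standard M/M/1 assumptions (Poisson arrivals, exponential service, single server), the classic closed-form results follow directly; the rates below are illustrative:

```python
# M/M/1 queue, using the notation above: x = 1/mu, t = 1/lambda,
# rho = x/t = lambda/mu. The rates are illustrative assumptions.
lam = 8.0    # arrival rate (customers per second)
mu = 10.0    # service rate (customers per second)

x = 1 / mu               # mean service time
t = 1 / lam              # mean time between arrivals
rho = x / t              # utilization factor, = lam / mu

# Standard M/M/1 results (valid only while rho < 1):
mean_in_system = rho / (1 - rho)       # average number of customers
mean_time_in_system = 1 / (mu - lam)   # average time in system

print(f"rho = {rho:.2f}")                                  # 0.80
print(f"avg customers in system: {mean_in_system:.1f}")    # 4.0
print(f"avg time in system: {mean_time_in_system:.2f} s")  # 0.50
```

Note how the time in system blows up as ρ approaches 1, which is why utilization alone understates the cost of running a network near capacity.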
You should be able to figure this out. The human race spends a lot of effort watching the skies. A lot.
I wonder if it's built into the genes?
produce an average saving in time of a procedure of x% per period, or an efficiency/service-time increase of x% per period. It takes xxx bits to do this. We receive them at a rate of multiple Tb/s per channel, and there are a few hundred billion channels (one per star in the galaxy). And a technology requires about a TB of information. That's enough to build a human being (take my word for it. Includes DNA
rated in savings percentage per unit time. On the receiving side you have to multiply the proposed
solution by the percentage of it that is applicable to the existing situation, which will be less than 100
percent. There's always some friction in remote transactions. This is all formalized in a paper by Paul Krugman: "The Theory of Interstellar Trade."
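The receive-side discounting just described is simple arithmetic; a sketch with made-up numbers for the channel rate, the claimed savings, and the applicability fraction:

```python
# Valuing a received technology, per the discussion above.
# All numbers are illustrative assumptions.
tech_size_bits = 8e12          # a ~1 TB technology blueprint
channel_rate_bps = 2e12        # "multiple Tb/s per channel"
download_time_s = tech_size_bits / channel_rate_bps

# Receiving-side discount: only part of the proposed solution
# applies to the local situation, always less than 100 percent.
nominal_savings_per_period = 0.05   # claimed 5% saving per period
applicability = 0.6                 # fraction applicable locally
effective_savings = nominal_savings_per_period * applicability

print(f"download time: {download_time_s:.1f} s")                # 4.0
print(f"effective saving per period: {effective_savings:.1%}")  # 3.0%
```

The asymmetry is striking: the download takes seconds, but the round-trip to negotiate or verify anything takes tens of thousands of years, so all the friction is in the latency, not the transfer.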
hydrogen bomb: 1952. According to the man who could predict something from nothing, there's a 95% chance of an extinction event occurring between 800 and 8 million years from now.
read the underlying books about the Berserkers by Fred Saberhagen.
that they aren't fundamentally related. They actually may be directly related, in that the uncertainty principle sets the accuracy of measuring two variables simultaneously, and Shannon's law sets the amount of power required to transmit information reliably. To transmit information reliably you have to measure bits of signal accurately, and this defines how much power you need to differentiate two different signals. The detection noise is typically Gaussian, which requires the sources to be separated by more than half their width to be differentiated. This is the same requirement made by quantum mechanics. This is the minimum noise you will see in any system. This quantum noise is unavoidable in the real world. It can't be edited out or corrected without sacrificing
latency. You can devise a code to run over a channel that will guarantee the data is delivered correctly, but it adds latency. By Shannon's channel coding theorem, as long as the rate stays below the channel capacity, the probability of error falls exponentially with the code's block length, so the extra latency needed for extra accuracy grows only logarithmically in the reliability you demand, not linearly or polynomially. What is the best you can
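The Gaussian-detection requirement above (signal levels separated by more than their noise width) maps to a concrete error probability. A sketch using the standard Gaussian tail function, with illustrative separations:

```python
import math

# Probability of confusing two signal levels under Gaussian noise.
# Two levels separated by distance d, with noise standard deviation
# sigma, are mistaken for each other with probability Q(d / (2*sigma)),
# where Q is the Gaussian tail function. Numbers are illustrative.
def q_function(z: float) -> float:
    """Gaussian tail probability P(N(0,1) > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

sigma = 1.0
for d in (1.0, 2.0, 4.0, 8.0):   # separations in units of sigma
    p_err = q_function(d / (2 * sigma))
    print(f"separation {d:.0f} sigma -> error prob {p_err:.2e}")
```

The error probability collapses rapidly as the separation (i.e. the transmit power) grows, which is the power-for-reliability trade that both Shannon's law and the quantum noise floor impose.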