Stephen Hawking: Exhibit A on what constitutes a false appeal to authority.
His latest? We're about to wipe ourselves out.
Maybe we are, maybe we're not, but he brings no special scientific credence to the table on this issue. Unless, that is, there's a previously undiscovered black hole hurtling toward Earth, and even then, it would be the black hole wiping us out, not we ourselves.
Look, he may be right.
But that's not because he says so; it's because the most pessimistic climate scientists turn out to be right, or the most pessimistic students of nuclear weaponry do.
On the former, while I'm no Pollyanna, I'm no James Howard Kunstler, either. On the latter, while I'm no modern-day Atoms for Peace type, I think the world, by slow fits and starts, continues to become slightly less nuclearized.
Back to Hawking.
He's been wrong before in something halfway approaching his sphere of expertise, which further undercuts his credibility outside that sphere: his earlier claim that aliens could wipe us out. Given that we've found no evidence of such aliens so far, and given that, IMO, many touters of SETI have overestimated the values of most variables in the Drake equation, may have psychological reasons for doing so, especially in America, and are deservedly spoofed, I highly doubt his claims.
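To make the point about those overestimated variables concrete, here's a minimal sketch of the Drake equation, N = R* · fp · ne · fl · fi · fc · L. The two parameter sets below are purely illustrative assumptions of mine, not anyone's published estimates; the point is how wildly the answer swings depending on who's filling in the blanks.

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: estimated number of detectable civilizations.

    r_star:   rate of star formation per year
    f_p:      fraction of stars with planets
    n_e:      habitable planets per star with planets
    f_l:      fraction of those that develop life
    f_i:      fraction of those that develop intelligence
    f_c:      fraction of those that broadcast detectably
    lifetime: years such a civilization keeps broadcasting
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Rosy values of the sort SETI boosters tend to favor (illustrative only)
optimistic = drake(r_star=10, f_p=0.5, n_e=2, f_l=1.0, f_i=0.5, f_c=0.5,
                   lifetime=10_000)

# Stingier values (equally illustrative)
pessimistic = drake(r_star=1, f_p=0.2, n_e=0.1, f_l=0.1, f_i=0.01, f_c=0.1,
                    lifetime=500)

print(f"optimistic:  {optimistic:,.0f} civilizations")
print(f"pessimistic: {pessimistic:.4f} civilizations")
```

Since the equation is just a product of seven guesses, nudging a few of the fractions downward drags the estimate from tens of thousands of civilizations to effectively zero, which is exactly why the optimists' inputs matter more than the formula itself.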
Of course, his answer has been manned colonization elsewhere.
Per the commenter below, if use of sarcasm is an ad hominem, then we've thrown the Overton Window open like a French window.
Second, given that Hawking has been wrong about manned travel to Mars, overlooking the dangers of cosmic radiation and other costs, his credibility in this whole area is kind of shot.
I'll grant a slightly higher likelihood that Hawking is right about high-tech robots, but to the degree I do, it's not because of his say-so; it's because of the say-so of experts in the field.
Back to the main point.
False appeal to authority is just one of a couple of dozen fallacies of informal logic. I encourage people to bookmark that link.
Update: I'm going to tackle the aliens part in further depth.
First, given when our first big radio telescopes were built, we know pretty well that there's no intelligent life broadcasting within 100 light years of Earth.
Second, alien civilizations advanced enough to travel 100 light years or more probably wouldn't even bother wasting their time on us.
Third, if they did want to wipe us out, they'd have Star Trek-type cloaking devices or something. We'd never know what hit us.
Update 2: Per the second round of comments, I've tagged this post "salvific technologism." To explain, that's the idea that technology will always be the cavalry, always riding over the hill to save human nature. We've already seen the massive problems with importing one invasive species to battle another. The idea that humans would go beyond that, that human control of biology will always save us (it won't always fail us, though, and this should not be read as anything anti-GMO), and that human technological engineering will always save us, whether through geoengineering against climate change or blasting off to another planet after wrecking this one, is appalling, as well as being about as likely to be true as the biological controls I just mentioned.
And, for either of the commenters, or anyone else who hasn't read through my blog otherwise, those thoughts aren't just mine. I strongly suggest reading the likes of Evgeny Morozov.