Michael Belfiore interviews them. No mention of space applications, though.
Climate science has been thrown into disarray by the hiatus, by disagreement between climate-model and instrumental estimates of climate sensitivity, by uncertainties in carbon uptake by plants, and by diverging interpretations of ocean heating (in the face of a dearth of observations). ‘Certainty’ arguably peaked at the time of the AR4 (2007); the perception of uncertainty is arguably greater than at any time since the FAR (1990). Yes, of course we know more about the climate system than we did then, but more knowledge about a complex system opens up new areas of ignorance and greater uncertainty.
In the context of the way climate sensitivity is defined by the IPCC, uncertainty in climate sensitivity is decreasing as errors in previous observational estimates are identified and eliminated, and model estimates seem to be converging. Climate-model simulations, when compared with 21st-century observations, seem to be running too hot, giving credence to the lower observation-based sensitivity values.
What do the lower values of climate sensitivity imply for policy? Well, slower rates of warming make it easier to adapt, and provide time to develop new technologies and new policies. But true believers such as Mann et al. dismiss adaptation and the development of new technologies and policies as ‘inaction.’ The policy logic apparent in the essays critical of my op-ed is rather naive.
So we are left with science in disarray and naive logic regarding policy. And the ‘warm team’ wonders why people are yawning?
She should cite my piece on the precautionary principle.
Seems to have very publicly resigned from the American Physical Society:
The global warming scam…is the greatest and most successful pseudoscientific fraud I have seen in my long life as a physicist.
Come on, Prof. Don’t hold back. Tell us what you really think. Watch out for the lawsuits, though.
OK, so I installed Gnucash on my machine last week, and it worked like a charm. I rebooted over the weekend after a yum update (which included a kernel rebuild I think) and now when I try to load the program, it crashes, with this response:
157: 16 [catch #t #<…>]
In unknown file:
?: 15 [apply-smob/1 #<…>]
3597: 14 [process-use-modules (((gnucash price-quotes)))]
702: 13 [map #<…>]
3598: 12 [#<…>]
2864: 11 [resolve-interface (gnucash price-quotes) #:select ...]
2789: 10 [#<…>]
3065: 9 [try-module-autoload (gnucash price-quotes) #f]
2401: 8 [save-module-excursion #<…>]
3085: 7 [#<…>]
In unknown file:
?: 6 [primitive-load-path "gnucash/price-quotes" ...]
41: 5 [#<…>]
3597: 4 [process-use-modules (((www main)))]
702: 3 [map #<…>]
3598: 2 [#<…>]
2867: 1 [resolve-interface (www main) #:select ...]
In unknown file:
?: 0 [scm-error misc-error #f "~A ~S" ("no code for module" (www main)) #f]
ERROR: In procedure scm-error:
ERROR: no code for module (www main)
Any ideas from anyone what the problem might be? I’ve tried uninstalling/reinstalling, with no joy.
[Update a few minutes later]
Someone else seems to have the same problem, or a very similar one. I’ve emailed Mssr. Villemont.
Also, I’ve come up with a temporary fix to let me get taxes done. Skrooge seems to be able to import the data. It’s more of a personal-finance app than a business one, but it will let me do what I need to do until I get Gnucash fixed.
[Update a few minutes later]
Great. I can import my personal finances, but it fails when it tries to bring in the business books.
[Update a while later]
Good news. I deleted the file recommended at that page, and Gnucash seems to load properly now.
An interesting interview with Robin Hanson on brain emulation, AI, and the flaws of humanity.
The statistical meltdown:
The sensitivity of the climate to increasing concentrations of carbon dioxide is a central question in the debate on the appropriate policy response to increasing carbon dioxide in the atmosphere. Climate sensitivity and estimates of its uncertainty are key inputs into the economic models that drive cost-benefit analyses and estimates of the social cost of carbon.
Continuing to rely on climate-model warming projections based on high, model-derived values of climate sensitivity skews the cost-benefit analyses and estimates of the social cost of carbon. This can bias policy decisions. The implication of the lower values of climate sensitivity in our paper, as well as of other similar recent studies, is that human-caused warming near the end of the 21st century should be less than the 2-degrees-Celsius “danger” level for all but the IPCC’s most extreme emission scenario.
That’s the wrong answer. It doesn’t justify ending capitalism.
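To see the scale of what’s at stake, here’s a toy Python calculation of the logarithmic CO2–warming relationship Curry is invoking; the end-of-century concentration ratio is a made-up illustrative number, not a projection from either paper:

```python
import math

def equilibrium_warming(ecs, conc_ratio):
    """Equilibrium warming (deg C) for a given CO2 concentration ratio,
    given an equilibrium climate sensitivity (ECS) in deg C per doubling."""
    return ecs * math.log2(conc_ratio)

# Hypothetical end-of-century CO2 ratio relative to preindustrial.
RATIO = 1.9

low = equilibrium_warming(1.6, RATIO)   # observation-based sensitivity
high = equilibrium_warming(3.0, RATIO)  # model-derived sensitivity
print(f"ECS 1.6: {low:.2f} C   ECS 3.0: {high:.2f} C")
```

With the lower sensitivity the same concentration path comes in under the two-degree threshold; with the model-derived value it doesn’t, which is exactly why that one number drives the cost-benefit arithmetic.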
I’ve updated yesterday’s piece at Ricochet to clarify, for those in comments. I’ve probably discussed this here before, but…
Per discussion in comments, there seems to be some confusion about the difference between high-altitude flight, suborbital flight, and orbital flight. As John Walker points out, orbital flight requires a minimum speed to sustain the orbit, but while that is necessary, it is not a sufficient condition. In fact, a flight can be suborbital with the same speed (energy) as an orbital flight. The best, or at least most rigorous, way to define a “suborbit” is as an orbit that intersects the atmosphere and/or surface of the planet. So if you launched straight up at orbital velocity, it would still be a suborbit, because it would (after an hour or two; I haven’t done the math) fall back to the ground. So while John’s numbers in terms of comparative energy are roughly correct for the particular vehicles being discussed here (the XCOR Lynx and VG SpaceShipTwo), they can’t be generalized to any suborbital vehicle (e.g., a sounding rocket isn’t orbital, but it goes much higher than those passenger vehicles, often hundreds of kilometers in altitude).
The speed necessary to achieve orbit is partly a function of the mass of the body being orbited, but it is also a function of its diameter, and of whether or not it has an atmosphere. If the earth were a point mass, an object tossed out at an altitude equivalent to the earth’s radius (that is, at what is now ground level) would have very little velocity, but it would have a lot of potential energy. It would fall, gain speed, whip around the center, and come back up to the person who had tossed it. That is, it would orbit. So even for the relatively low-energy suborbital vehicles discussed in this post, the reason that they’re not orbital is simply that the planet gets in the way.
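A quick back-of-the-envelope in Python (standard textbook constants, atmosphere ignored) shows the energy-versus-trajectory point: launch straight up at surface orbital speed and you coast to an apogee of about one earth radius, then fall right back:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

# Circular orbital speed at the surface (atmosphere ignored).
v_circ = math.sqrt(MU / R_EARTH)

# Launch straight up at that same speed: all kinetic energy converts to
# potential energy at apogee (energy conservation, speed at apogee = 0):
#   v^2/2 - MU/R = -MU/r_max
r_max = 1.0 / (1.0 / R_EARTH - v_circ**2 / (2.0 * MU))
altitude = r_max - R_EARTH

print(f"surface orbital speed: {v_circ/1000:.1f} km/s")
print(f"vertical-launch apogee altitude: {altitude/1000:.0f} km")
```

Same energy as a circular orbit, but because the velocity has no horizontal component, the trajectory intersects the planet: a suborbit.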
One other interesting point is that, under the definition above, subsonic “parabolic” aircraft flights in the atmosphere, flown to offer half a minute or so of weightlessness (as offered by the Zero G company), are suborbital flights in terms of their trajectory. I put “parabolic” in quotes because in actuality, if properly flown, they are really elliptical sections, as all orbits and suborbits are. The parabola is just a close approximation if you assume a flat earth, which is a valid assumption for the short distances involved. Galileo did his original artillery tables assuming a flat earth, which is why beginning physics students model cannonball problems as parabolas, but modern long-range artillery has to account for the earth’s curvature, and so it calculates elliptical trajectories.
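To put a number on how good the flat-earth approximation is, here’s a Python sketch comparing the parabolic (uniform-gravity) range of a hypothetical 300 m/s shell with a numerical integration under gravity pointing at the earth’s center (the elliptical case); the launch parameters are invented for illustration:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R = 6.371e6           # mean Earth radius, m
G0 = MU / R**2        # surface gravity, ~9.82 m/s^2

def range_flat(v, theta):
    """Flat-earth (parabolic) range for launch speed v, elevation theta."""
    return v**2 * math.sin(2 * theta) / G0

def range_central(v, theta, dt=0.001):
    """Range with gravity pointing at the earth's center (elliptical arc),
    integrated with semi-implicit Euler until the shell hits the surface."""
    x, y = 0.0, R
    vx, vy = v * math.cos(theta), v * math.sin(theta)
    t = 0.0
    while True:
        r = math.hypot(x, y)
        vx += -MU * x / r**3 * dt
        vy += -MU * y / r**3 * dt
        x += vx * dt
        y += vy * dt
        t += dt
        if t > dt and math.hypot(x, y) <= R:
            break
    # Arc length along the surface between launch and impact points.
    return R * math.atan2(x, y)

v, theta = 300.0, math.radians(45)
flat = range_flat(v, theta)
curved = range_central(v, theta)
print(f"parabolic: {flat:.0f} m, elliptical: {curved:.0f} m")
```

At this scale the two ranges agree to well under a percent, which is why the parabola serves beginning physics students so well; the discrepancy grows with range, which is why long-range artillery can’t use it.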
Finally, one more extension. Ignoring the atmosphere, every artillery shell fired, every ball thrown or hit, every long jumper, every person who simply hops up into the air, is in a suborbit. The primary distinction for the vehicles discussed is that they are in a suborbit that reaches a specific altitude (at least a hundred kilometers to officially be in “space”), and leaves the atmosphere.
Clear as mud?
Can you catch it from an infected blanket?
With a bonus electron microscope picture of the virus erupting from an infected cell.
I didn’t mention this earlier in the week, but SNC is teaming with StratoLaunch to get a subscale version into orbit. If it’s 75% scale, I figure that’s about 40% of the current interior volume, which lines up with their claim of being able to carry two or three passengers (the full-scale system is designed for seven). The big advantage of such a system would be single-orbit rendezvous, and runway landing, so if it happens, there’d certainly be a market niche for it.
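The roughly-40-percent figure follows from volume scaling as the cube of linear dimension, as a one-line check shows:

```python
scale = 0.75
volume_ratio = scale ** 3  # interior volume scales as the cube of linear scale
print(f"{volume_ratio:.0%} of full-scale interior volume")
```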
I hadn’t realized they’re more than just memory:
“People look at these things and see them as nothing more than storage devices,” says Caudill. “They don’t realize there’s a reprogrammable computer in their hands.”
In an earlier interview with WIRED ahead of his Black Hat talk, Berlin-based Nohl had said that he wouldn’t release the exploit code he’d developed because he considered the BadUSB vulnerability practically unpatchable. (He did, however, offer a proof-of-concept for Android devices.) To prevent USB devices’ firmware from being rewritten, their security architecture would need to be fundamentally redesigned, he argued, so that no code could be changed on the device without the unforgeable signature of the manufacturer. But he warned that even if that code-signing measure were put in place today, it could take 10 years or more to iron out the USB standard’s bugs and pull existing vulnerable devices out of circulation. “It’s unfixable for the most part,” Nohl said at the time. “But before even starting this arms race, USB sticks have to attempt security.”
Caudill says that by publishing their code, he and Wilson are hoping to start that security process. But even they hesitate to release every possible attack against USB devices. They’re working on another exploit that would invisibly inject malware into files as they are copied from a USB device to a computer. By hiding another USB-infecting function in that malware, Caudill says it would be possible to quickly spread the malicious code from any USB stick that’s connected to a PC and back to any new USB plugged into the infected computer. That two-way infection trick could potentially enable a USB-carried malware epidemic. Caudill considers that attack so dangerous that even he and Wilson are still debating whether to release it.
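Nohl’s proposed fix, accepting only manufacturer-signed firmware, looks roughly like the sketch below. This is a stdlib-only Python illustration, not anyone’s actual controller code: it uses HMAC as a stand-in, whereas a real design would use an asymmetric signature (e.g., Ed25519) so the device holds only a public verification key, not a secret:

```python
import hashlib
import hmac

# Hypothetical manufacturer key, for illustration only. A real scheme would
# use an asymmetric signature so the secret never leaves the manufacturer.
MANUFACTURER_KEY = b"example-secret-key"

def sign_firmware(image: bytes) -> bytes:
    """Signature the manufacturer would ship alongside a firmware image."""
    return hmac.new(MANUFACTURER_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """What the USB controller would check before rewriting its firmware."""
    expected = hmac.new(MANUFACTURER_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"\x7fUSB-controller-firmware-v2"
sig = sign_firmware(firmware)
print(device_accepts(firmware, sig))                   # legitimate update
print(device_accepts(firmware + b"\x00patch", sig))    # tampered: rejected
```

The catch, as Nohl says, is that no such check exists in deployed controllers, and retrofitting it means redesigning the security architecture of billions of devices.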
It’s as simplistic and stupid as thinking that CO2 is a magical control knob for the climate.
An almost book-length book review of Naomi Klein’s idiotic book.
A new paper that indicates it’s probably much lower than the models think.
…is not settled:
…the crucial, unsettled scientific question for policy is, “How will the climate change over the next century under both natural and human influences?” Answers to that question at the global and regional levels, as well as to equally complex questions of how ecosystems and human activities will be affected, should inform our choices about energy and infrastructure.
But—here’s the catch—those questions are the hardest ones to answer. They challenge, in a fundamental way, what science can tell us about future climates.
Yup. The 97% “nonsensus” is multiple strawmen, because all it ever meant, to the degree that it wasn’t just BS, was that scientists agree that there is a greenhouse effect and that therefore human-generated carbon emissions can affect climate. Beyond that, there is no consensus.
Joanne Nova has the story.
An email I just got from Amazon:
I hope this e-mail finds you well. Thank you very much for patiently waiting for my answer.
I’ve been checking regularly with our technical team on their progress with resolving the age range issue. It appears the issue is more complex than expected and we’re still working hard to get a solution for you.
I wanted to send you a quick e-mail to let you know the findings so far:
Since Amazon uses certain characters to classify books according to their content, it tends to be quite limited when it comes to character recognition. In the case of “Safe Is Not An Option,” although our platform gave you the chance to set the age ranges as 8 (min) and 18+ (max), the website is not displaying the (+) symbol because this character is generally used to determine whether a book is of mature content or not. Since the book is targeted to people that are 8 and up, the system is finding a contradiction due to the title being categorized as children’s while also being set as an adult book because of the ’18+’.
We are aware indeed that what you wished to communicate is that the book was written for all people starting at age 8; even so, due to legal and international marketplace matters, the store has determined that the ‘+’ sign next to the ’18’ number makes reference exclusively to adult or erotica content, which results in a classification restriction. Due to this, the website removes the ‘+’ sign automatically and replaces it with the single ’18’ number to make your book fall within the appropriate ranges for children and adults.
We are still working to find a way to have the ‘8 – 18+’ displayed on your book’s page. Still, if by any chance the platform was unable to digest that entry, what I’d recommend to do is leave the age ranges as they are, and within the book’s description, you may clarify that the book is indeed intended for people aged 8 and up. I’ll let you know how everything goes!
I hope this information helps explain clearly the situation; I’m very sorry for how long this is taking, but I greatly appreciate your understanding!
I’ll be in touch again with an update as soon as possible.
Thanks again for your patience.
I’m kind of amazed that I’m the first person in Kindle history who wanted to show that a book could be enjoyed by children of all ages.
I’ve at least updated the book description to say that it’s suitable for all ages.
Some reflections from Judith Curry on Professor Mann’s latest court filing.
[Update early afternoon]
After being caught out claiming he was a “Nobel Prize recipient” in his original complaint (then having to retract it), it seems Mann and his lawyers just don’t have the good sense to know when to stop. In this case Mann has been “hoisted by his own petard”. His very own words condemn him. Again.
It would help if seat assignments could be made based on personal info, matching up tall with short, and providing some number of extra-wide seats for extra-wide people, but I’m not sure how practical that would be.
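For what it’s worth, the matching itself would be trivial; here’s a hypothetical Python sketch that pairs the tallest remaining passenger with the shortest, so legroom-hungry and legroom-light flyers share a row (the practicality problem is collecting the data, not the algorithm):

```python
def pair_tall_with_short(passengers):
    """Pair the tallest remaining passenger with the shortest.
    passengers: list of (name, height_cm) tuples."""
    ranked = sorted(passengers, key=lambda p: p[1])
    pairs = []
    lo, hi = 0, len(ranked) - 1
    while lo < hi:
        pairs.append((ranked[hi][0], ranked[lo][0]))
        lo += 1
        hi -= 1
    if lo == hi:                 # odd passenger out gets a row alone
        pairs.append((ranked[lo][0],))
    return pairs

people = [("Ann", 152), ("Bob", 198), ("Cy", 175), ("Dee", 185), ("Eve", 160)]
print(pair_tall_with_short(people))
```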
A new paper showing what BS it is. That this kind of thing continues to be repeated is why the warm mongers have no credibility.
It’s not the height, it’s the velocity.
It’s also worth noting that a suborbit can be accurately defined as an orbit that intersects the earth or its atmosphere. So even if you have orbital speed, if there’s not a sufficient horizontal component to it, you’ll still end up back on the earth before you go around.
As I noted on Twitter:
Anyone who continues to push "97%" nonsense is either pig ignorant or a lying demagogue. No other options. http://t.co/BVKTYuC3Tw
— Rand Simberg (@Rand_Simberg) September 5, 2014
Judith Curry explains:
I think we need to declare the idea of a 97% consensus among climate scientists on the issue of climate change attribution to be dead. Verheggen’s 82-90% number is more defensible, but I’ve argued that this analysis needs to be refined.
Climate science needs to be evaluated by people outside the climate community, and this is one reason why I found Kahan’s analysis to be interesting of people who scored high on the science intelligence test. And why the perspectives of scientists and engineers from other fields are important.
As I’ve argued in my paper No consensus on consensus, a manufactured consensus serves no scientific purpose and can in fact torque the science in unfortunate ways.
And José Duarte is appropriately brutal:
Yes, I’m shocked, too.
It’s not logical to state that most warming since 1950 has been caused by man (or Mann):
The glaring flaw in their logic is this. If you are trying to attribute warming over a short period, e.g. since 1980, detection requires that you explicitly consider the phasing of multidecadal natural internal variability during that period (e.g. AMO, PDO), not just the spectra over a long time period. Attribution arguments of late 20th century warming have failed to pass the detection threshold which requires accounting for the phasing of the AMO and PDO. It is typically argued that these oscillations go up and down, in net they are a wash. Maybe, but they are NOT a wash when you are considering a period of the order, or shorter than, the multidecadal time scales associated with these oscillations.
Further, in the presence of multidecadal oscillations with a nominal 60-80 yr time scale, convincing attribution requires that you can attribute the variability for more than one 60-80 yr period, preferably back to the mid 19th century. Not being able to address the attribution of change in the early 20th century to my mind precludes any highly confident attribution of change in the late 20th century.
In other words, we shouldn’t and can’t have as much confidence as many would like to push their policy agenda.
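Curry’s phasing point is easy to demonstrate with a toy temperature series in Python: a fixed secular trend plus a 65-year oscillation (all parameters invented for illustration). A 30-year window aligned with the oscillation’s rising phase substantially inflates the apparent trend, while a two-cycle window recovers it:

```python
import math

TREND = 0.010              # deg C per year, the "true" secular trend
AMP, PERIOD = 0.10, 65.0   # AMO/PDO-like oscillation, invented numbers

def temp(t):
    """Toy temperature anomaly: secular trend plus multidecadal cycle."""
    return TREND * t + AMP * math.sin(2 * math.pi * t / PERIOD)

def fitted_slope(ts):
    """Ordinary least-squares trend of the toy series over the years ts."""
    n = len(ts)
    tbar = sum(ts) / n
    ys = [temp(t) for t in ts]
    ybar = sum(ys) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
    den = sum((t - tbar) ** 2 for t in ts)
    return num / den

# Window centered on the oscillation's steepest rising phase (t = 0).
short_slope = fitted_slope(list(range(-15, 16)))   # ~30 years
long_slope = fitted_slope(list(range(-65, 66)))    # two full cycles
print(f"30-yr slope: {short_slope:.4f} C/yr, 130-yr: {long_slope:.4f} C/yr")
```

The oscillation averages out over two full cycles, but over a window shorter than its period it masquerades as trend, which is the attribution problem in a nutshell.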
I will confess that I have done many of these things. Though I’m also often the quiet engineer who eventually speaks up.
Though I actually do know what “Will this scale?” means. I often ask it, seriously. No one at NASA seems to understand it, though.
I didn’t mention it last week, because I’ve been busy dealing with life, but both we and National Review submitted briefs in the case to the DC Court of Appeals last Monday. I’m not sure if the CEI brief has been discussed anywhere, but here’s a discussion of National Review’s. We requested that the lower-court ruling to refuse dismissal be overturned and the case dismissed (implicitly) with prejudice. That means that if the appeals court agrees, we can go after Mann for legal costs.
Anyway, the reason I mention it now is that Alliance Defending Freedom has filed an amicus brief today on our behalf. I’ve got the filing, but haven’t seen any links to it yet. We also have one from Reason, Cato, Goldwater Institute, and the Individual Rights Foundation.
[Late evening update]
OK, we’ve got a couple more. One is from Newsmax Media, Inc., Free Beacon, LLC, The Foundation for Cultural Review, The Daily Caller, LLC, PJ Media, LLC, and The Electronic Frontier Foundation. The other is from the Reporters Committee for Freedom of the Press and twenty-six other media organizations, which I won’t list here.
Also, as with the last time, the District of Columbia has filed an amicus on our behalf to defend its anti-SLAPP law.
I’m guessing that a lot more media organizations are filing this time because they were shocked at the ruling last time, and wanted to make their views clear to the appellate court.
CEI has links to all the legal filings in the case to date, including Monday’s amici.
A great piece on the general irrationality about them, and the history. I find most interesting (and new) the point that the main benefit of posting a speed limit was not to slow the fastest down, but to speed the slowest up. More people need to understand that it is not absolute speed that is dangerous, but relative speed. When I was young, in Michigan, before Nixon’s double-nickel stupidity, the freeway signs had both a maximum and a minimum: 70/45. That was back in the days when older cars weren’t as safe or reliable at higher speeds. Today, I’d make it more like 80/60.
I’m also glad that they (as I always do) pointed out what a problem a lack of lane discipline is. If they’d give tickets for hogging the left lane, instead of speeding, traffic would flow both more smoothly and more safely.