Category Archives: Technology and Society

Frying A Turkey Without Oil?

Not exactly, despite the claim of this post:

Deep frying is a form of convection heating. Instead of hot air, you are using hot oil to transfer the heat. Depending on the oil used in the fryer, the temperature is usually about 375 degrees to keep the food from absorbing a lot of oil.

The Big Easy uses infrared energy to “bathe” food. It excites the proteins, not the water. Thus, you are literally frying it. It’s just like sitting in the sun all day. The infrared energy will “fry” your meat’s skin. The Big Easy doesn’t need a lid because it’s better to let the hot air escape. That way your food doesn’t dry out and there’s no basting necessary. Unlike conventional turkey fryers there is also no warm-up period. Just drop your thawed turkey (stuffed or unstuffed, injected or not, sugar-less rubbed or not) into the chamber and turn the Big Easy on. Infrared energy starts cooking it immediately and the cooking time for a 12-14-pound turkey will be cut almost in half.

Without expressing an opinion on the relative merits of cooking a turkey this way, it’s not equivalent to deep-fat frying. As it says, it only radiates the skin, whereas a deep fryer gets hot oil inside the bird as well, which has to speed up the cooking time considerably. And if the oil is sufficiently hot, there’s no reason that it has to make the bird greasy, or any more so than it would be naturally from its own fat.

The Big Easy™ is $165 at Amazon, whereas serviceable fryers are available for less than half the price. Of course, with the former, you don’t need any oil, which might save you ten bucks or so per turkey preparation, so it might pay for itself over time if you do a lot of turkeys. But considering the time value of money, I think that you’d have to be a real turkey fan to make up the difference. Of course, it might be good for other meats as well.
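To put numbers on that back-of-the-envelope, here’s a quick sketch of the payback arithmetic. The $165 price and the ten-bucks-an-oil-fill savings are from above; the conventional fryer price is my own assumption based on “less than half the price”:

```python
# Payback sketch for the Big Easy vs. a conventional oil fryer.
# The fryer price is an assumption, not a quoted figure.
BIG_EASY_PRICE = 165.00       # from the post
OIL_FRYER_PRICE = 80.00       # "less than half the price" -- assumed
OIL_SAVED_PER_TURKEY = 10.00  # "ten bucks or so per turkey preparation"

def turkeys_to_break_even(extra_cost, saving_per_turkey):
    """Turkeys cooked before oil savings cover the higher purchase price."""
    return extra_cost / saving_per_turkey

extra = BIG_EASY_PRICE - OIL_FRYER_PRICE
print(turkeys_to_break_even(extra, OIL_SAVED_PER_TURKEY))  # 8.5 turkeys
# Discounting those future savings (the time value of money) only pushes
# the break-even point further out.
```

So even ignoring discounting, you’re roughly nine turkeys in before the oil savings catch up.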

[Update late evening]

Contrary to Glenn’s comment, I don’t call “foul.” The proper spelling is “fowl.”

A Kludge

Is this the future of air travel?

Engineers created the A2 with the failures of its doomed supersonic predecessor, the Concorde, very much in mind. Reaction Engines’s technical director, Richard Varvill, and his colleagues believe that the Concorde was phased out because of a couple major limitations. First, it couldn’t fly far enough. “The range was inadequate to do trans-Pacific routes, which is where a lot of the potential market is thought to be for a supersonic transport,” Varvill explains. Second, the Concorde’s engines were efficient only at its Mach-2 cruising speed, which meant that when it was poking along overland at Mach 0.9 to avoid producing sonic booms, it got horrible gas mileage. “The [A2] engine has two modes because we’re very conscious of the Concorde experience,” he says.

Those two modes–a combination of turbojet and ramjet propulsion systems–would both make the A2 efficient at slower speeds and give it incredible speed capabilities. (Engineers didn’t include windows in the design because only space-shuttle windows, which are too heavy for use in an airliner, can withstand the heat the A2 would encounter.) In the A2’s first mode, its four Scimitar engines send incoming air through bypass ducts to turbines. These turbines produce thrust much like today’s conventional jet engines–by using the turbine to compress incoming air and then mixing it with fuel to achieve combustion–and that’s enough to get the jet in the air and up to Mach 2.5. Once it reaches Mach 2.5, the A2 switches into its second mode and does the job it was built for. Incoming air is rerouted directly to the engine’s core. Now that the plane is traveling at supersonic speed, the air gets rammed through the engine with enough pressure to sustain combustion at speeds of up to Mach 5.

A combination turbofan/ramjet. Hokay.
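The two-mode scheme described in the excerpt amounts to a simple dispatch on Mach number. The mode behaviors and the Mach-2.5 switchover come from the article; the function itself is purely my illustration:

```python
# Toy dispatch for the two-mode Scimitar cycle described in the excerpt.
# The Mach-2.5 switchover and Mach-5 ceiling come from the article;
# the function and mode labels are illustrative.
def engine_mode(mach):
    """Which propulsion mode the A2 would use at a given Mach number."""
    if mach < 2.5:
        # Incoming air goes through bypass ducts to turbomachinery,
        # producing thrust like a conventional jet engine.
        return "turbojet"
    if mach <= 5.0:
        # Air is routed straight to the core; ram pressure alone
        # sustains combustion at these speeds.
        return "ramjet"
    raise ValueError("beyond the A2's stated Mach-5 envelope")

print(engine_mode(0.9))  # turbojet (subsonic cruise over land)
print(engine_mode(5.0))  # ramjet (high-speed cruise over water)
```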

If I understand this properly, the idea is to fly fast subsonic over land to avoid breaking windows, and then to go like a bat out of hell over the water. When I look at that design, I have to wonder how they can really get the range, with all of the drag implied by those huge delta wings, not to mention the wave drag at Mach 5. I also wonder where they put the hydrogen–that stuff is very fluffy, and needs large tanks. It’s probably not a wet wing (that would be very structurally inefficient), which is why the fuselage must be so huge: to provide enough volume for the fuel.
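To see just how “fluffy” hydrogen is, compare the tank volume needed per unit of fuel energy. These are standard handbook figures for the two fuels, not anything from the A2 design:

```python
# Back-of-envelope tank-volume comparison: liquid hydrogen vs. Jet A.
# Densities and lower heating values are standard handbook figures.
LH2_DENSITY = 71.0    # kg/m^3, liquid hydrogen
LH2_ENERGY = 120.0    # MJ/kg
JETA_DENSITY = 800.0  # kg/m^3
JETA_ENERGY = 43.0    # MJ/kg

def tank_volume_m3(energy_mj, energy_per_kg, density):
    """Tank volume needed to carry a given amount of fuel energy."""
    return energy_mj / energy_per_kg / density

energy = 1.0e6  # MJ of onboard fuel energy (arbitrary illustration)
v_lh2 = tank_volume_m3(energy, LH2_ENERGY, LH2_DENSITY)
v_jeta = tank_volume_m3(energy, JETA_ENERGY, JETA_DENSITY)
print(v_lh2 / v_jeta)  # hydrogen needs roughly 4x the tank volume
```

Hydrogen’s weight advantage per unit energy doesn’t help you when the penalty is a fuselage-sized tank.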

Sorry, but I don’t think that this will be economically viable. As is discussed in comments and the article, hydrogen is not an energy source–it’s an energy storage method, and it’s unclear how they’ll generate it without a greenhouse footprint. Moreover, it’s not as “green” as claimed, because dihydrogen monoxide itself is a greenhouse gas. I’ll bet that this thing has to fly at sixty thousand feet or more to get itself sufficiently out of the atmosphere to mitigate the drag problem, and that’s not a place where you want to be injecting a lot of water.

This concept doesn’t learn the true lessons of Concorde. As with Shuttle, a lot of people have drawn lessons from Concorde, but the wrong ones. The correct lesson is that we need to get rid of shock waves and drag. Once we do that, we’ll be able to cruise at reasonable speeds (say, Mach 2.5) everywhere, over both land and water, so we won’t have to build the vehicle out of exotic materials or eliminate windows. We’ll also be able to have fast transcontinental trips (two hours coast to coast), which is another huge market that this concept doesn’t address at all. Finally, such a vehicle has to achieve a reasonable lift/drag ratio, so that ticket prices will be affordable. And I think that the fuel issue is a red herring–Jet A will be just fine for the planet, as long as fuel consumption is reasonable, and its much greater density makes the vehicle design much easier.

Fortunately, I’ve been working for over a decade with a company that thinks it knows how to do this, and I’m hoping that we’ll be able to start to move forward on it very soon.

[Via Clark Lindsey]

[Update in the late afternoon]

In response to the question in comments, there’s not much publicly available on the web about shock-free supersonics, but here’s a piece I wrote a few years ago on the subject.

Caulking The Ship

It’s about time. Firefox is finally fixing its memory leaks:

No matter the reason or the timing, Mozilla claims progress on the memory front. In its release notes, the company trumpeted the fact that the just-released Beta 3 plugged more than 350 leaks, with over 50 stopped in the last eight weeks alone.

“We’ve made a lot of progress,” said Schroepfer. “Our memory usage is significantly improved, and dramatically better than [Microsoft’s] Internet Explorer 7.”

But the work’s not finished. “Most of the big memory issues are resolved, and we’re seeing some pretty good numbers [on memory consumption], but some additional [work] is one reason why we felt we needed Beta 4.”

That’s been one of my biggest complaints about Firefox. At any given time I may have forty or more tabs open, and memory usage would grow to the point where the machine was paging so much to disk that it was brought to its knees, and I’d have to kill Firefox to recover the memory.

But I also have to say that since I upgraded my RAM from one to two gigs (on a Windows 2000 machine), the problem has largely gone away. For anyone who’s unaware (and particularly now that memory prices are plummeting), the cheapest thing you can do to improve your computer’s performance (dramatically, in my case) is to give it lots of memory.

But I may go get the beta version of Firefox 3 anyway.

Never Mind

There was a little bit of a buzz in the blogosphere a few days ago about Gizmodo’s report that the Japanese plan to bombard the planet with frickin’ laser beams from outer space. No word on whether or not they would be attached to the heads of sharks.

I thought it was a little strange, myself. While lasers have been proposed for space solar power, most of the concepts over the past four decades (ed– wow, it’s really been four decades since Peter Glaser came up with the idea? Yup) have been to transmit the power with microwave beams. Lasers (probably free-electron lasers with tuned frequencies) have the advantage of higher power density (and thus less need for large ground receivers). But they don’t penetrate the atmosphere and clouds as well, and they are less efficient for power conversion. Also, they raise exactly the fear described in the Gizmodo piece–that higher power density is a double-edged sword. Microwaves are preferred because the energy conversion efficiency is very high, and the beam density is less than that of sunlight (it’s better than sunlight despite this, because the beam is available 24/7 and the conversion efficiency is much better, at least with current solar cell technology). Microwaves are also much more difficult to weaponize, by the nature of the technology.
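As a rough illustration of why a beam kept below sunlight intensity can still beat ground solar on average delivered power, here’s a sketch with round numbers of my own choosing (peak sunlight, cell efficiency, capacity factor, and rectenna efficiency are all illustrative assumptions, not measured data):

```python
# Why a microwave beam held below sunlight intensity can still deliver
# more average power per unit area than ground-based solar panels.
# All numbers are round illustrative assumptions.
SUN_PEAK = 1000.0      # W/m^2, peak sunlight at the surface
PV_EFF = 0.20          # solar-cell conversion efficiency
SOLAR_CAPACITY = 0.25  # usable fraction of the day (night, clouds, sun angle)

BEAM_DENSITY = 250.0   # W/m^2, beam kept below sunlight levels for safety
RECTENNA_EFF = 0.85    # rectenna RF-to-DC conversion efficiency

solar_avg = SUN_PEAK * PV_EFF * SOLAR_CAPACITY  # average W/m^2 from panels
beam_avg = BEAM_DENSITY * RECTENNA_EFF          # average W/m^2, 24/7 beam
print(beam_avg / solar_avg)  # the beam delivers several times more per m^2
```

The beam’s round-the-clock availability and the rectenna’s high conversion efficiency more than make up for the deliberately low power density.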

Anyhoo, I’m assuming that what was actually being referred to was this:

On February 20, JAXA will take a step closer to the goal when they begin testing a microwave power transmission system designed to beam the power from the satellites to Earth. In a series of experiments to be conducted at the Taiki Multi-Purpose Aerospace Park in Hokkaido, the researchers will use a 2.4-meter-diameter transmission antenna to send a microwave beam over 50 meters to a rectenna (rectifying antenna) that converts the microwave energy into electricity and powers a household heater. The researchers expect these initial tests to provide valuable engineering data that will pave the way for JAXA to build larger, more powerful systems.

Microwaves, not lasers, as Gizmodo mistakenly claimed. The article does mention lasers as a potential means of getting the power down, but that’s not what next Wednesday’s test is about.

More Stem Cell Advances

This stuff really is moving along at a good clip:

“Our reprogrammed human skin cells were virtually indistinguishable from human embryonic stem cells,” said Plath, an assistant professor of biological chemistry, a researcher with the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research and lead author of the study. “Our findings are an important step towards manipulating differentiated human cells to generate an unlimited supply of patient specific pluripotent stem cells. We are very excited about the potential implications.”

The UCLA work was completed at about the same time the Yamanaka and Thomson reports were published. Taken together, the studies demonstrate that human iPS cells can be easily created by different laboratories and are likely to mark a milestone in stem cell-based regenerative medicine, Plath said.

Repeatability–one of the hallmarks of solid science. Of course, they always have the standard caveat:

“It is important to remember that our research does not eliminate the need for embryo-based human embryonic stem cell research, but rather provides another avenue of worthwhile investigation.”

I think that, at some point, the embryo work will be abandoned, because even for many researchers, it’s ethically problematic. But they will have to do a lot of correlation and validation before they can get to that point.

In any event, stuff like this brings us much closer to escape velocity.

[Via Fight Aging]