|
Post by zenbillionaire on Feb 4, 2011 15:34:24 GMT -4
I recently came across an argument against the credibility of the Apollo missions based on the film documents. I hadn't heard it before and I'm looking for a rebuttal; thought maybe someone here could help out.
The premise is that film stock can't tolerate the extreme temperatures encountered on the lunar surface, which were estimated to range between -153C in shadow and 107C in direct sunlight. Having some limited experience with film stock (knowing for example that it's basically plastic) I found the assertion superficially credible, but wasn't able to find anything quickly with Google that would either confirm or deny the claim.
Does anyone know the type of environmental controls used to mitigate these extremes and their effect on the film? I've heard the cameras used were variations on a commercial Hasselblad with a polished aluminium case and that aluminium is an excellent heat conductor. I can understand how polishing the case to a mirror finish would reduce absorption of radiant heat, but not how it would reduce dissipation significantly. How did they manage to keep their film from freezing and shattering? Electric heaters? First I thought they were somehow heated through conduction from the suit, then remembered they were removable and mounted on the outside of it. I can't imagine the suits bleeding enough heat to warm a camera and still being able to keep the occupant from freezing, certainly not through any sort of vacuum gap. My idea would be to put some sort of glass isolation mount between the case and the magazine but I can't find anything saying that was the technique used.
Maybe the cameras were so small (and the mass so relatively large) they had enough thermal inertia to stay within operating temps for the duration of the EVA with no special design considerations? I am no longer proficient with the Laplace transform...
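For what it's worth, here's a crude lumped-capacitance sketch of how fast a camera could radiate away its heat in shadow. Every number here is a guess on my part (mass, heat capacity, surface area, emissivity), so treat it as an order-of-magnitude illustration only, not a real thermal model:

```python
# Worst-case radiative cooling of a camera in shadow, lumped-capacitance
# model: the whole body is at one temperature, radiating to empty space
# with no sunlight, no conduction, and no view of warm surroundings.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
mass = 2.0         # kg, guessed camera + film magazine
c_p = 900.0        # J/(kg K), roughly aluminium
area = 0.1         # m^2, guessed radiating surface area
eps = 0.05         # far-IR emissivity, typical of polished metal

T = 293.0          # start near room temperature, in kelvin
dt = 60.0          # one-minute time steps
for minute in range(120):          # simulate a two-hour EVA
    power = eps * SIGMA * area * T**4   # radiated watts at current temp
    T -= power * dt / (mass * c_p)      # temperature drop this step

print(f"after 2 h in shadow: {T - 273.15:.0f} C")
```

With a low-emissivity polished case the thing only loses a few degrees over an entire EVA, which suggests thermal inertia plus a reflective finish really is most of the answer.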
Another person pointed out that ionizing radiation has a known negative effect on film, causing it to fog. Evidence of this behavior is common. Since there was no obvious shielding, why aren't the photos fogged?
|
|
|
Post by gillianren on Feb 4, 2011 16:09:56 GMT -4
One issue is the word "obvious": obvious to whom, and measured how? The other issue is "temperature of what?"
|
|
|
Post by Jason Thompson on Feb 4, 2011 16:15:40 GMT -4
"I can understand how polishing the case to a mirror finish would reduce absorption of radiant heat, but not how it would reduce dissipation significantly."
Same way. Reflective surfaces are poor absorbers and poor emitters of radiant heat. The film is not in thermal contact with the outside of the casing.
Any ionising radiation argument is useless without numbers. Why aren't the photos fogged? Should they be? Are the levels of radiation, and the types of radiation, significant for film fogging? Remember, they have the burden to prove their argument right, not the right to expect you to argue your case to disprove them.
|
|
|
Post by randombloke on Feb 4, 2011 16:26:12 GMT -4
To say that the surface of the Moon varies between -lots and +lots is accurate, as long as you remember that you're dealing with the surface of the Moon. That is, the actual dirt on the actual ground. The cameras, however, spent most of their time clipped to the astronauts' chests while the astronauts themselves went to great pains not to fall flat on their faces.
Thus the surface temperature of the Moon is mostly irrelevant to the cameras' temperatures, because there is almost no conduction path (through the astronauts themselves, in their actively temperature-controlled suits?) and no convection path at all, leaving radiative transfer as the only way for them to change their temperatures. And as anyone who's ever owned a vacuum flask will know, radiative transfer is very, very slow.
That is, the answer to your question is: They were heated by the cabin heaters in the LM then taken into one of the best insulated environments known to man, where they proceeded to lose very little energy at all, even assuming they were in shade the entire time.
What I'd like to know is how come no-one makes this "argument" about the perfectly ordinary mechanical watches they wore on the outsides of the suits; I'm pretty sure that the tiny, tiny springs in those are at least as sensitive to extreme temperature variation as film.
|
|
|
Post by Obviousman on Feb 4, 2011 18:01:08 GMT -4
Tell the person who wants to know that they should go to the NASA Technical Reports Server and download NASA CR-92015, "A study to determine the optimum design of a photographic film for the lunar surface hand-held camera - Final report", and NASA SP-5099, "Photography Equipment and Techniques - A Survey of NASA Developments".
It's all in there, like it has been for decades.
|
|
|
Post by trebor on Feb 4, 2011 19:30:23 GMT -4
"The premise is that film stock can't tolerate the extreme temperatures encountered on the lunar surface, which were estimated to range between -153C in shadow and 107C in direct sunlight."
The film was not on the lunar surface; it was separated from it by the body of the camera and quite a bit of vacuum. Nor was it in direct sunlight. The film used had a polyester base which was specifically designed for low-temperature use.
As an additional note, here are the thermal properties for the film type from Kodak documentation:
255° C (490° F): Melting point. Solid becomes fluid.
130° C (255° F): Distortion and shrinkage. Crystallization occurs.
>100° C (212° F): Distortion can occur with non-uniform heating.
100° C (212° F): Shrinkage up to 0.15% occurs, stabilizes in 24 hrs.
82° C (180° F): Shrinkage up to 0.06% occurs, stabilizes in 48 hrs.
80° C (176° F): Transition temperature. Increase in volume. Film loses some stiffness.
49° C (120° F): Shrinkage up to 0.02% occurs, stabilizes in 10 days.
For most of the trip the film was well shielded behind the walls of the Command Module. For the rest, the radiation levels were not high enough, and the film not sensitive enough to radiation, for fogging to be significant.
The paper "Sensitivity of Photographic Film to Nuclear Radiation in Near-Earth Missions" by Edward L. Noon and Richard R. Brown may be of interest here. It shows a great variation in the sensitivity of film to radiation as well as the effectiveness of minimal shielding.
|
|
|
Post by ka9q on Feb 8, 2011 4:20:18 GMT -4
These film temperature objections have been around for years. They demonstrate one of the major categories of misconceptions that help make Apollo hoax claims seem superficially plausible.
That category is heat transfer in a vacuum. People have an intuitive notion of what it means when the ground is at such-and-such a temperature. But that intuition is based on a lifetime of living on the earth at the bottom of a fairly dense (1.2 kg/m^3) atmosphere that holds and carries heat quite well. So when the ground is at a certain temperature, the atmosphere helps ensure anything and everything in the area will assume roughly the same temperature.
This just isn't so on the lunar surface. Without an atmosphere, objects transfer heat only by radiation, a much less effective mechanism. Every object heats or cools until it reaches an equilibrium temperature where it radiates exactly as much heat as it absorbs (and produces internally, in the case of a powered machine or living organism). Because each object can have its own surface optical properties and be exposed to a different set of neighboring objects and temperatures, a set of objects on the moon can assume wildly different temperatures compared to the same objects on the earth's surface.
So with the proper thermal design the Apollo camera could keep its film at a comfortable temperature even when the surface around it was freezing or broiling. It reached an equilibrium temperature that could be controlled by applying the proper coatings to the outside of the camera.
Just looking at a given thermal coating only tells you part of the story: how it behaves in visible light where the sun radiates most of its power. A light object obviously reflects more solar power than a dark object.
The other side of the story is how the surface behaves in the far infrared region where any object anywhere near room temperature radiates. (This wavelength is around 10 microns, vs the half to one micron wavelengths of visible and near infrared where the sun, a much hotter object, has its radiation peak. IR-type passive motion sensors work by looking at the 10 micron far IR radiation emitted by the human body.)
You can find materials with every possible combination of visible and far-IR properties. Some materials reflect visible light very well but absorb (and radiate) efficiently in the far IR; these run quite cold even in direct sunlight. They're especially useful for radiators like those inside the payload bay of the space shuttle.
Other materials absorb nearly all of the visible light falling on them (i.e., they look relatively dark) but radiate poorly in the far IR; these obviously run extremely hot in the sun, so they'd make good thermal solar power collectors. (Interestingly metallic gold is in this category.) And other materials are light (or dark) in both the visible and the far infrared. By picking the right materials to cover your device you can usually have it settle down at whatever comfortable temperature you choose, even though it may be close to other objects that are very hot or very cold.
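The trade-off is easy to put numbers on. Here's a toy energy balance for a flat plate facing the sun in vacuum, with the lunar surface below it ignored; the absorptivity/emissivity values are rough illustrative guesses, not measured properties of any real coating:

```python
# Equilibrium temperature of a sunlit plate in vacuum.
# At equilibrium, absorbed solar power equals emitted thermal power:
#   alpha * S = epsilon * SIGMA * T**4   (per unit area)
# so T depends only on the ratio alpha/epsilon.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant near Earth/Moon, W/m^2

def equilibrium_temp(alpha, epsilon):
    """Equilibrium temperature in kelvin for a given solar
    absorptivity (alpha) and far-IR emissivity (epsilon)."""
    return (alpha * S / (epsilon * SIGMA)) ** 0.25

# Guessed coating properties for illustration only:
coatings = [("polished metal", 0.15, 0.05),   # low alpha, low epsilon
            ("white paint",    0.25, 0.90),   # low alpha, high epsilon
            ("black paint",    0.95, 0.90)]   # high alpha, high epsilon
for name, a, e in coatings:
    print(f"{name:14s}: {equilibrium_temp(a, e) - 273.15:6.0f} C")
```

Note that the shiny metal actually runs hotter than the white paint in full sun, because it's also a lousy far-IR emitter; what matters is the alpha/epsilon ratio, not reflectivity alone.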
The hard part is maintaining a constant temperature under changing conditions. That's where thermal engineering becomes a real challenge. Fortunately, the amount of thermal power radiated by any object varies as the fourth power of its temperature, so a small change in temperature can result in a very big change in thermal output. This keeps the change in temperature with changing conditions smaller than it otherwise might be.
|
|