So, as mentioned yesterday, I got an email asking me about the weird scandal involving the Patriots and underinflated footballs, so I wrote a piece for The Conversation on the subject. Since a few people had beaten me to citations of the Ideal Gas Law, though, I decided to bring my own particular set of skills into this, and did an experiment.
You can see the basic set-up at the link– I got a couple of footballs from the athletic department and stuck them in the freezer, then used one of the PASCO pressure sensors we have for the intro labs to measure the pressure. For a popular article, of course, I didn’t go into much detail about this. That’s why I have a blog…
So, the relevant physics here is the “Ideal Gas Law,” usually written:

PV = nRT
In this context, P is the pressure in the ball, V the volume of the ball, T the temperature, n the number of moles of gas inside the ball, and R a constant to make the units work out right. This says that the product of pressure and volume is proportional to the amount of stuff inside the ball multiplied by the temperature, and is one of the founding equations of the subfield of thermodynamics.
Something like a football is made of fairly stiff leather, and once it’s mostly inflated, it really doesn’t change volume very much. Which means that unless you let air out of the ball, reducing n, the pressure is proportional to the temperature. If you decrease the temperature, you decrease the pressure, and vice versa. This leads to all sorts of fun effects– my favorite silly example is that little curl of mist you see at the lip of a freshly opened bottle of beer in TV commercials. The beer inside the bottle is under pressure, and when you pop the cap, the pressure drops suddenly, which leads to a corresponding decrease in temperature. And, if you’re in a somewhat humid environment, that produces a little mist as water vapor in the air condenses.
For the case that’s producing my current fifteen minutes of fame, what matters is the pressure change that comes from a change in temperature. One possible innocent explanation of all this might be that the balls were inflated in a warm place, then the game was played in a cool place, and the pressure decreased as a result. So, how plausible is that?
Well, you might say that the temperature change is pretty substantial, even for a relatively warm game– if it was, say, 75 degrees inside, and the game was played in 50 degree temperatures, well, that’s a change of about 33%, right (25 degree drop out of 75 degrees to start)? The problem is, those temperatures are in Fahrenheit (America, f&*k yeah!), and we’re talking about physics. The temperature that matters here is the temperature in Kelvin, measured starting from absolute zero. In which case the change is just about 14 degrees out of nearly 300, or around 5%. Not nearly enough to produce the 15% change in pressure found by the investigation.
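If you want to check that arithmetic yourself, here’s a quick sketch using the same illustrative 75 F indoors / 50 F outdoors numbers from the paragraph above:

```python
def fahrenheit_to_kelvin(t_f):
    """Convert a Fahrenheit temperature to Kelvin."""
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

t_warm = fahrenheit_to_kelvin(75.0)   # about 297 K
t_cold = fahrenheit_to_kelvin(50.0)   # about 283 K

# Naive (wrong) fractional change, using the Fahrenheit numbers directly:
naive_change = (75.0 - 50.0) / 75.0          # about 33%

# Physically meaningful fractional change, measured from absolute zero:
real_change = (t_warm - t_cold) / t_warm     # about 4.7%

print(f"{naive_change:.1%} vs {real_change:.1%}")
```

The ~14 K drop is less than 5% of the ~297 K starting temperature, which is why a warm-room-to-cold-field story can’t account for a 15% pressure drop on its own.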
But a bigger temperature change, like the one in my simple freezer experiment, produces a clearly significant pressure change. Since it’s not science without graphs, here are some graphs showing the data from the two balls:
There are four data points here: The initial pressure in the lab before the experiment, the pressure after spending a night in the -20C freezer, the pressure after spending three hours in the crisper drawer of the fridge, and the pressure after warming back up to room temperature. I’ve plotted these versus the temperature in Celsius, though I’ve mixed unit systems by measuring pressure in pounds per square inch, since that’s the unit used by the NFL (America, f&*k yeah!).
As you can see, the pressure measurements for the two balls fall nicely on two slightly different straight lines, reflecting the slightly different starting pressures (the balls were flat when I got them, and I inflated them using the battery-powered compressor in my car, whose pressure gauge isn’t really designed for the sub-20 psi pressures a football needs…).
Looks pretty good for the Ideal Gas Law. So, what else can I do with this? Well, I can fit a straight line to these, and work out the value of absolute zero– the Ideal Gas Law suggests that as you continue to decrease the temperature, you should eventually hit a point where the pressure goes to zero. This defines the origin of the Kelvin scale. And, in fact, this is how absolute zero was first determined, more or less– by measuring the pressure vs. temperature for a bunch of different gases over a wide range, and extrapolating to zero pressure.
So, the lines on the graph above are just that: the results of a linear fit to each of the data sets. From these lines, I can work out two values of absolute zero, and get -314 and -307 Celsius, respectively, with an uncertainty of around 29 Celsius. The actual official value is -273.15, so this is pretty good. I’m a little surprised at just how well that worked, to be honest, as air isn’t really an ideal gas, and the volume of the football isn’t perfectly constant, and the temperature range I could access easily isn’t that large. But, yay, thermodynamics.
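The extrapolation procedure is easy to sketch in code. The data points below aren’t my actual measurements– they’re generated from the gas law itself (assuming a ball at 12.5 psi gauge, so roughly 27.2 psi absolute, at 20 C) just to show the method, which is why the fit recovers absolute zero essentially exactly; real data with noise gives the kind of -314 to -307 C spread quoted above.

```python
import numpy as np

# Illustrative temperatures: freezer, fridge crisper, and room.
t_celsius = np.array([-20.0, 4.0, 20.0])

# Generate ideal-gas pressures from an assumed 27.2 psi absolute at 20 C.
p_room = 27.2
p_psi = p_room * (t_celsius + 273.15) / (20.0 + 273.15)

# Fit a straight line P = slope * T + intercept, then find where P = 0.
slope, intercept = np.polyfit(t_celsius, p_psi, 1)
absolute_zero = -intercept / slope

print(f"Extrapolated absolute zero: {absolute_zero:.2f} C")
```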
Of course, it’s also a little inelegant to have those two lines on there, since the Ideal Gas Law is supposed to be a universal principle. So, I also did a very physicist-y thing, and made a “normalized” graph. The idea here is that I don’t actually care about the exact pressure, or volume, and I definitely don’t care about how many moles of gas are in the ball. All I really care about is the relative changes in these things. So, I can work in terms of ratios of different measurements:

P1V1/(n1T1) = R = P2V2/(n2T2)

A little algebra gives (canceling out R, which is a universal constant, along with V and n, which don’t change for a given ball):

P1/P2 = T1/T2
I call this a very physicist-y trick, because it makes life so much easier– I don’t need to worry about the absolute value of any of these things, or even what system of units I operate in. All I need to do is take ratios– divide each pressure measurement for a given ball by one of the other values, and divide each temperature (in Kelvin, or Rankine if you insist on the stupid system that uses Fahrenheit-sized degrees but starts at absolute zero) by the temperature at that same reference point. All of those data points should fall on a single, nice, universal straight line.
So I did that, and:
There are actually two datasets there, but you can barely tell, because the red points fall almost exactly on top of the black ones. I didn’t plot a best-fit line for these, but the agreement is excellent.
So, the Ideal Gas Law really does work extremely well. Hooray for physics!
There is one minor mystery here, though, which was pointed out to me in email, namely that there’s some ambiguity about the pressure measurements. What you usually measure is “gauge pressure,” namely pressure above the ambient atmospheric pressure. So in that case, the pressure inside the ball should be not just what the gauge reads, but the gauge reading plus about 14.7 psi. In which case, the 2 psi change reported in the “Deflategate” story ought to be a smaller percentage change, and thus obtainable via a smaller temperature change. I didn’t think of that when I initially did this experiment, but the change I see agrees really well with just sticking these numbers into the Ideal Gas Law. So I don’t know exactly what’s going on here– maybe the sensors I was using were measuring absolute pressure rather than gauge pressure, but that would mean the balls were significantly underinflated, while they were very definitely overinflated, going by the feel of the ball.
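To put rough numbers on the gauge-vs-absolute puzzle: here’s how big a temperature drop a 2 psi loss implies under each reading. The starting point (12.5 psi gauge, the NFL minimum, at roughly 70 F) and the 14.7 psi atmosphere are assumptions for illustration:

```python
t_start = 294.0        # K, roughly 70 F, assumed starting temperature
p_gauge = 12.5         # psi gauge, the NFL minimum
p_atm = 14.7           # psi, standard atmospheric pressure

# If you (wrongly) treat the gauge reading as absolute, a 2 psi loss
# is a 16% drop, requiring a 16% temperature drop:
t_drop_naive = t_start * (2.0 / p_gauge)

# Using the true absolute pressure, it's only about a 7.4% drop:
t_drop_absolute = t_start * (2.0 / (p_gauge + p_atm))

print(f"{t_drop_naive:.0f} K vs {t_drop_absolute:.0f} K of cooling needed")
```

The absolute-pressure accounting needs only ~22 K (~39 F) of cooling to shed 2 psi, versus ~47 K the naive way– which is exactly why the gauge/absolute distinction matters for how plausible the innocent explanation is.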
So, we’ll leave that dangling as an issue to be resolved– call it homework. If you know what’s wrong with this in terms of the gauge/absolute pressure difference, leave a comment and let me know.
from ScienceBlogs http://ift.tt/1BkAIic