Measuring moisture content in tobacco with a meter is tricky business. In the current edition of the NASPC's The Pipe Collector, I co-authored an article with my good friend Mike Zicha that examines how moisture content and bowl size affect the smoke.
One challenge we faced was how to measure moisture content accurately. We had a basic moisture meter but quickly discovered that getting consistent readings was very difficult because of how these meters work. Essentially, a moisture meter is an ohmmeter, measuring electrical resistance between two probes. So, the more compressed the tobacco, the lower the resistance, which results in a higher moisture reading.
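To make the compression effect concrete, here is a minimal sketch in Python. The calibration curve is entirely made up for illustration (real meters use their own proprietary calibrations); the only point is that a lower resistance reading shows up as a higher indicated moisture.

```python
import math

def indicated_moisture(resistance_kohm: float) -> float:
    """Map a probe resistance (kilohms) to an indicated moisture percentage.

    Hypothetical curve, purely for illustration: roughly 30% at 1 kOhm,
    tapering to about 5% at 1000 kOhm. Lower resistance -> higher reading.
    """
    return max(5.0, 30.0 - 8.3 * math.log10(resistance_kohm))

# The same tobacco, packed loosely vs. firmly against the probes:
print(indicated_moisture(500.0))  # loose fill, higher resistance -> lower reading
print(indicated_moisture(50.0))   # firm fill, lower resistance  -> higher reading
```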
I had been corresponding with Greg Pease about how these measurements are done in the tobacco industry and quickly determined that the professional methods were far too costly for our research. When I shared with Greg the consistency problem we were having, he recommended measuring a fixed weight of tobacco in a fixed volume. That greatly improved repeatability, but we had to be satisfied with a repeatable approximation.
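One simple way to see how much the fixed-weight, fixed-volume approach helps is to take several readings on the same prepared sample and look at the spread. The helper below is a rough sketch; the readings in the example are placeholders, not our actual data.

```python
from statistics import mean, stdev

def summarize_readings(readings):
    """Return the mean and spread of repeated meter readings on one sample."""
    return mean(readings), stdev(readings)

# e.g. five readings (percent) taken on a sample packed to a fixed weight in a
# fixed measuring cup; these values are placeholders for illustration only.
avg, spread = summarize_readings([15.2, 15.6, 15.1, 15.4, 15.3])
print(f"mean {avg:.1f}%, spread +/- {spread:.1f}%")
```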
A more reliable method for measuring relative changes in moisture is by weight. For the purpose of this test, I assumed the tobacco's starting moisture content by weight to be 17%. In all likelihood, that's a little low, but at the time I started this experiment, I knew nothing about measuring hydration. Suppose I had 100g of tobacco; at 17% moisture, it would contain 17g of water at the start. If half of that moisture were lost, leaving the tobacco almost crispy, its weight would drop by 17 x 0.50, or 8.5g, and the tobacco would then weigh 91.5g.
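Here is the same arithmetic as a small Python function, assuming, as above, a 17% starting moisture figure and that any weight lost is water.

```python
def moisture_percent(start_weight_g: float, start_moisture: float,
                     current_weight_g: float) -> float:
    """Estimate moisture % by weight after some water has evaporated."""
    water_start_g = start_weight_g * start_moisture    # 100 g * 0.17 = 17 g
    water_lost_g = start_weight_g - current_weight_g   # assume the loss is all water
    water_now_g = water_start_g - water_lost_g
    return 100.0 * water_now_g / current_weight_g

# Losing half of the original 17 g of water leaves 91.5 g of tobacco,
# which works out to roughly 9.3% moisture remaining:
print(moisture_percent(100.0, 0.17, 91.5))
```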