I do no-sparge BIAB, with recipes I modify and calculate in Excel using the formulas from Designing Great Beers by Ray Daniels.
I had been brewing 5-gal batches in a 10-gal pot, but recently worked up a recipe for 6 gals, which of course means a larger grain bill. My calculations were as follows: 13 lbs of grain, adjusted for the extract potential of each type of malt, at an assumed mash efficiency of 75%, predicts a target OG of 1.055. I mash with 8.36 gals of treated water, then drain and squeeze the bag to extract all the wort I can, and boil for one hour.
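For reference, here's the spreadsheet math as a quick Python sketch (a minimal version of my calculation, assuming the target OG is figured on the 6-gal post-boil volume; the 440 total potential gravity units come up again below):

```python
# Recipe prediction, mirroring my spreadsheet formulas.
potential_gu = 440       # total gravity points from 13 lbs of grain at 100% efficiency
mash_efficiency = 0.75   # assumed mash efficiency
batch_gal = 6.0          # post-boil batch volume in gallons

points = potential_gu * mash_efficiency      # 330 gravity points into the kettle
target_og = 1 + points / batch_gal / 1000    # 330 / 6 = 55 points -> OG 1.055
print(f"Target OG: {target_og:.4f}")         # 1.0550
```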
At the end of the mash, I had 7.1 gals of wort with a temperature-corrected hydrometer reading of 1.0364. Dividing the total gravity points of the wort (7.1 × 36.4 = 258.4) by the potential GUs at 100% efficiency (440) yields a calculated mash efficiency of 58.7%.
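In the same terms:

```python
# Mash efficiency from the pre-boil measurement.
pre_boil_gal = 7.1
pre_boil_points = 36.4    # temperature-corrected hydrometer reading of 1.0364
potential_gu = 440        # 100%-efficiency potential of the grain bill

measured_points = pre_boil_gal * pre_boil_points   # 258.4 total gravity points
mash_eff = measured_points / potential_gu
print(f"Mash efficiency: {mash_eff:.1%}")          # 58.7%
```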
Lousy yield! But after the boil, when I take another temperature-corrected hydrometer reading, my measured OG is 1.0551, exactly what the original formula predicted at 75% mash efficiency!
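Running the numbers backward from the measured OG shows why those two readings can't both be right: the boil doesn't add gravity points, it only concentrates the wort as water evaporates, so total points before and after the boil should match. A quick check (assuming roughly 6 gals left in the kettle after the boil, which I haven't measured precisely):

```python
# Efficiency implied by the post-boil measurement.
post_boil_gal = 6.0      # assumed post-boil volume (not measured precisely)
og_points = 55.1         # temperature-corrected OG of 1.0551
potential_gu = 440

post_boil_points = post_boil_gal * og_points     # ~330.6 total gravity points
implied_eff = post_boil_points / potential_gu
print(f"Implied efficiency: {implied_eff:.1%}")  # 75.1%
# 258 points pre-boil vs. ~331 points post-boil: since the boil conserves
# total gravity points, one of the two hydrometer readings must be wrong.
```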
So how can I hit the predicted OG if my efficiency is really that far off?
The only explanation I can think of is that the wort sample I use to calculate mash efficiency (which contains hop debris and occasionally a few grains that escaped the bag) is somehow not representative of the gravity of the full volume of wort.
The next time I brew, I may try straining the wort sample before I take a hydrometer reading to calculate my mash efficiency.
Thanks in advance for any helpful comments and advice.