My software displays measurements at higher precision than the instrument does: the instrument rounds its display to the nearest integer while still outputting the full-precision measurement. Everything agreed to within rounding distance, so I'm going to call that good.
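A minimal sketch of the kind of agreement check I mean, in Python. The function name and signature are mine for illustration, not from the actual software; the idea is just that an integer display is consistent with any full-precision value within half a unit of it:

```python
def agrees_within_rounding(displayed: int, full_precision: float) -> bool:
    """Check that the instrument's rounded display and the full-precision
    value it transmits could describe the same reading.

    A displayed integer is consistent with any full-precision value
    within 0.5 of it. Using <= rather than < deliberately accepts
    exact .5 values, since rounding conventions (half-up vs. banker's
    rounding) can differ between the two systems.
    """
    return abs(full_precision - displayed) <= 0.5
```

This only catches disagreements bigger than the display's own rounding, which is exactly the point: anything within half a unit is indistinguishable on the instrument's screen.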
With that cross-checking in place, and the latest hardware and software updates, most of the batches are checking out well, but the instrument seems biased toward producing a darker measurement. There's a suspicious operation in the instrument's code that I think could be responsible. Before concluding anything, I'd like to add some debug capability to watch specifically how the values in one array change during real batches, and see whether my intuition about that operation pans out.
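One cheap way to get that kind of debug visibility is to snapshot the array at a known point in the batch and report only the elements that changed since the last snapshot. This is a sketch of the technique, not the instrument's actual code; the names (`log_array_changes`, `_snapshots`) are hypothetical:

```python
# Last-seen copy of each watched array, keyed by a caller-chosen label.
_snapshots: dict = {}

def log_array_changes(label, arr):
    """Return a list of strings describing which elements of `arr`
    changed since the previous call with the same `label`.

    Returning strings instead of printing lets the caller route the
    output to whatever debug channel the instrument actually has
    (serial console, log buffer, etc.).
    """
    changes = []
    prev = _snapshots.get(label)
    if prev is not None:
        for i, (old, new) in enumerate(zip(prev, arr)):
            if old != new:
                changes.append(f"{label}[{i}]: {old} -> {new}")
    # Copy so later mutation of the caller's array doesn't alter the snapshot.
    _snapshots[label] = list(arr)
    return changes
```

Sprinkling calls like `log_array_changes("agtron_raw", readings)` before and after the suspicious operation would show directly whether that step is what's dragging values darker.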
This doesn't check that the readings are in any way accurate (that's what cross-checking actual roasted coffees against something that's not a prototype is for), just that the sending and receiving systems agree on what the signal represents.