Disclaimer – I’m well aware this is both a first world problem and highly unlikely to affect the quality of my coffee; I’m just interested in the principle, particularly in views from anyone in the metrology / engineering / science fields.
I was recently given some coffee scales, which are supposedly accurate to 0.1g. For context, to make an espresso you use a precise dose of beans, e.g. 18g, so in theory being able to tell the difference between 18g and 18.9g of beans could make a difference to flavour (see disclaimer). The scales are not hugely expensive, but are from a reputable kitchenware brand.
I usually ‘pour’ beans in, then slow down as I close in on 18g. I noticed that if I poured to 17.Xg and then added beans slowly as I closed in on the magic number, it seemed impossible to get the scales to move in 0.1g increments: I could add, say, 3 beans before the reading changed, and then it would jump by 0.2–0.3g.
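Out of curiosity I tried to model what might be going on. This is pure guesswork about the firmware (the 0.25g "stability" band and 0.13g bean weight are numbers I've invented for illustration), but a display that only catches up once the raw reading has drifted far enough from what's currently shown would behave exactly like mine does:

```python
# Speculative toy model: the display only updates when the raw reading
# moves more than a "stability" band away from what's shown.
# The 0.25 g band and 0.13 g/bean are made-up numbers.

def display_updates(bean_weight=0.13, band=0.25, n_beans=8):
    raw = 17.5    # grams actually on the platform
    shown = 17.5  # grams on the display
    for i in range(1, n_beans + 1):
        raw += bean_weight
        if abs(raw - shown) > band:
            shown = round(raw, 1)  # display catches up in a 0.2-0.3 g jump
        print(f"bean {i}: raw {raw:.2f} g, display {shown:.1f} g")

display_updates()
```

Run that and the display sits still for a couple of beans, then lurches by 0.2–0.3g in one go, even though it reads out in 0.1g steps.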
As a test, I then saw how many beans I could add to an empty scale before it registered anything. Adding the beans one, two, or three at a time, I found it was possible to pile on (what turned out to be) 20+g of beans without the scales registering anything at all. They wouldn’t register anything unless I added 7 or more beans at a time. I appreciate it’s variable, but 7 beans seem to weigh about 1g, putting each bean at more than 0.1g.
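The empty-scale behaviour looks to me like something different: "automatic zero tracking", which I gather is a common feature in cheap strain-gauge scales to cancel slow drift by quietly re-zeroing whenever the load changes by less than some small amount. Again the numbers here (0.5g threshold, 0.14g/bean) are invented, but the sketch reproduces what I saw:

```python
# Speculative toy model of auto-zero tracking: small, slow changes in
# load are treated as drift and silently absorbed into the zero point.
# The 0.5 g threshold and 0.14 g/bean are made-up numbers.

def drop_beans(batch_sizes, bean_weight=0.14, threshold=0.5):
    load = 0.0  # grams actually on the platform
    tare = 0.0  # the firmware's running idea of "zero"
    for n in batch_sizes:
        load += n * bean_weight
        if abs(load - tare) < threshold:
            tare = load  # small change -> assumed drift, absorbed
        print(f"+{n} beans: {load:.2f} g on scale, display {load - tare:.1f} g")

drop_beans([1] * 20)  # one at a time: every bean is absorbed, display stays 0.0
drop_beans([7])       # seven at once: a ~1 g step exceeds the band and registers
```

If that's roughly what the firmware does, you could load the platform indefinitely in small enough increments and the display would never move, which is exactly what happened with my 20+g of beans.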
Are the scales faulty? Or am I misunderstanding what accurate to 0.1g means?
