Saw this trick shared by OpenBuilds Facebook group member David Gbbonly. He used double-sided tape to stick a digital caliper (you can pick these up at Princess Auto/Harbor Freight for around $20) to the C-Beam and used it in the calibration process built into the OpenBuilds CONTROL software. Genius! After calibrating I was able to repeatably get within +/- 0.003 mm of accuracy.
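For anyone who wants to sanity-check the wizard's result by hand, the underlying correction is just the ratio of commanded travel to what the caliper actually reads. This is only a minimal sketch of that math, not necessarily exactly what OpenBuilds CONTROL does internally, and the 199.1 steps/mm and the readings are made-up example numbers:

```python
def corrected_steps_per_mm(current_steps_per_mm, commanded_mm, measured_mm):
    """Scale an axis' steps/mm by commanded travel over actual (caliper) travel."""
    return current_steps_per_mm * commanded_mm / measured_mm

# Example: $100 is currently 199.1 and a commanded 100.00 mm move reads 99.85 mm.
print(corrected_steps_per_mm(199.1, 100.00, 99.85))  # ~199.40 -> new $100 value
```

After writing the new value to the relevant GRBL setting ($100/$101/$102 for X/Y/Z), repeat the test move to confirm it lands where it should.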
WOW! I like it. OK, so I was using the method of taking a 90 degree chamfer bit, pecking a small indent, moving 100mm, and pecking another indent, then measuring with calipers. But even that still has the possibility of not being completely centered on the indent. Going back to recalibrate using this method. Thanks for sharing.
I've been doing that for years. I have a 12" set just for calibration. I made a mount for mine, but the double-stick tape idea is super. It's a lot less hassle than my mount.
It's a neat idea, but I still prefer to do it with a measuring tape since I prefer the longer distance for the calibration. He uses 100mm, whereas I use at least 1000mm. I think the accuracy is better, but maybe not.
A good test would be to see what your results are by sticking the caliper on and having it move 100mm without calibration.
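If anyone tries that test, the useful number is the percentage error, since a pure steps/mm (scale) error grows linearly with distance. A quick sketch, where the 99.87 mm reading is just a made-up example:

```python
commanded_mm = 100.00
caliper_mm = 99.87          # whatever the caliper actually reads on the test move

error_mm = caliper_mm - commanded_mm
error_pct = error_mm / commanded_mm * 100
print(f"error over 100 mm : {error_mm:+.3f} mm ({error_pct:+.3f} %)")

# A scale error grows linearly, so the same uncalibrated machine would be off
# by roughly ten times as much over a 1000 mm move.
print(f"projected over 1000 mm: {error_mm * 10:+.2f} mm")
```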
Maybe - but I think then you're also subject to how accurate your measuring tool is - how good is your tape measure? From research it looks like your common tape measure could be 1/16" off or more when measuring a distance longer than 12". That's 0.0625", or about 1.6mm, whereas the caliper is measuring to 0.001mm. Check this out: Precision measurement 101
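To put the two tools on the same footing, it helps to compare each error as a percentage of the distance it's actually used over. A rough back-of-the-envelope sketch, assuming the 1/16" worst-case tape figure above and a conservative 0.01 mm display resolution for a cheap digital caliper (finer than most $20 units is probably optimistic):

```python
tape_error_mm  = (1 / 16) * 25.4   # 1/16" worst-case tape error, about 1.59 mm
caliper_res_mm = 0.01              # assumed cheap digital caliper resolution

print(f"tape over 1000 mm  : {tape_error_mm / 1000 * 100:.3f} % worst case")   # ~0.159 %
print(f"caliper over 100 mm: {caliper_res_mm / 100 * 100:.3f} % resolution")   # 0.010 %
```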
I'm not trying to go into a pis***g match, but I'll tell you what I did. First, I use a tape that I bought at a Dollar Tree for a buck that has both metric and inches. I use the metric side. Second, I don't measure 12", as I've posted, but at least 1000mm, which is just over 39". Third, if I have an error, it would be in the less-than-1mm range. Most of the stuff that I cut will have to be measured with a tape or a caliper like the one @sharmstr shows in his post. That alone negates all inaccuracies I may have!
I did mine the same way, but I didn't slum it at the Dollar Store. No sir. My 48" aluminum measuring stick is Pittsburgh branded from Harbor Freight! Also, that 1/16" off with a measuring tape may be because many people do not know how to use a measuring tape properly. That 1/16" off is the thickness of the metal hook part, I bet. I have a measuring tape that is very expensive for surveying and is dead on. Every tape measure I own I tested against it, and they were all well within 1/16". Probably because tape measures used in commerce have to be accurate to 1/32". Going the Distance on National Tape Measure Day.
The question is whether the printed tape is off or the tape measure is off. Tape measures are off by 1/16"-1/8" over 36" consistently - I know this because I tested a bunch (I posted the results on Instagram). The printed tape itself is unlikely to be off by more than 1/32", per NIST standards, and can be used from a non-zero starting point with a spindle pointer. However, from an experimental-error point of view, 0.001" over 12" is technically better than 0.032" over 48". More averaging isn't always more better.

That said... in general, machine calibration is tricky, because drive systems have periodic and random error as well as directional error - backlash and torque strain. So every part of the machine has a different error, and that error also changes depending on which direction the axis is travelling, as well as other forces like cut load and feed speed.

So debating the relative merits of semi-accurate zero-or-one-dimensional measurement systems is kinda pointless as long as whatever's used can get into the ballpark of "close enough for what my intended work is". There's always going to be something that's out of spec somewhere you haven't specifically measured unless you're using $1500 ball screws, and then they're just gonna show up the inadequacies of the rest of the machine. Considering grbl doesn't even have simple single-value backlash compensation, it's not really reasonable to expect real axis calibration tables for it any time soon. Maybe once the 32-bit versions undergo further development and maybe get some serial IO, but I'm not holding my breath.

For the time being, tape measures and 12" digital calipers are basically gonna be fine. Experimental rigor would help make the most of limited tools - proper clamping, elimination of cosine error, multisampling, subdivision and linear regression, whatever - but mostly they're all gonna come up with about the same general quality of results. If I was happy with the results of a tape measure and pointer, I wouldn't run out and buy a digital caliper. But if I already had a digital caliper, I'd probably err more toward using that, maybe with the tape measure for some long-distance consistency checks.
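Since multisampling and linear regression got mentioned, here's roughly what that looks like in practice: command several positions along one axis (approaching each from the same direction so backlash stays out of the fit), record the caliper or tape reading at each, and fit a straight line. This is only a sketch; the positions, readings, and the 199.1 steps/mm starting value are made-up illustration numbers.

```python
# Multisampling + linear regression sketch: fit measured = scale * commanded + offset.
# All numbers below are made-up example readings, not real measurements.
commanded = [0.0, 25.0, 50.0, 75.0, 100.0]      # positions commanded along one axis (mm)
measured  = [0.02, 24.95, 49.86, 74.80, 99.73]  # caliper/tape readings at each (mm)

n = len(commanded)
mean_x = sum(commanded) / n
mean_y = sum(measured) / n

# Ordinary least-squares slope and intercept.
scale = sum((x - mean_x) * (y - mean_y) for x, y in zip(commanded, measured)) \
        / sum((x - mean_x) ** 2 for x in commanded)
offset = mean_y - scale * mean_x

old_steps_per_mm = 199.1                      # whatever $100 currently is
new_steps_per_mm = old_steps_per_mm / scale   # scale < 1 means the axis under-travels

print(f"scale = {scale:.5f}, offset = {offset:+.3f} mm")
print(f"new steps/mm ~= {new_steps_per_mm:.2f}")
```

The offset term soaks up any fixed zero error (like a tape hook or the caliper's zero point), so only the fitted scale gets folded into the GRBL steps/mm setting.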