Sometimes, a project turns out to be harder than expected at every turn and the plug gets pulled. That was the case with [Chris Fenton]’s efforts to gain insight into his curling game by adding sensors to monitor the movement of curling stones as well as the broom action. Luckily, [Chris] documented his efforts and provided us all with an opportunity to learn. After all, failure is (or should be) an excellent source of learning.
The first piece of hardware was intended to log curling stone motion as a way to measure the performance of the sweepers. [Chris] wanted to stick a simple sensor brick made from a Teensy 3.0 and an IMU to a stone and log all the motion-related data. The concept is straightforward, but in practice it wasn’t nearly as simple. The gyro, which measures angular velocity, did a good job of keeping track of the stone’s spin, but the accelerometer was a different story. An accelerometer measures how much something is speeding up or slowing down, but it simply wasn’t able to properly sense the gentle, gradual changes in speed that the stone underwent as the ice ahead of it was swept or not swept. A good idea in theory, but in practice it was the wrong tool for the job.
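Some rough numbers show why the accelerometer struggles. All figures below are illustrative assumptions, not measurements from [Chris]’s build: a draw takes a stone roughly 25 seconds to slow from about 2 m/s, so the mean deceleration is on the order of 0.1 m/s², and the sweeping effect is only a few percent of that. Typical MEMS accelerometer noise and, worse, gravity leakage from even a degree of tilt are both larger than the signal being hunted:

```python
import math

# Illustrative, assumed numbers -- not measurements from the actual build
g = 9.81
draw_speed = 2.2        # release speed of a draw [m/s]
travel_time = 25.0      # time for the stone to come to rest [s]
mean_decel = draw_speed / travel_time        # ~0.09 m/s^2
sweep_effect = 0.05 * mean_decel             # sweeping changes decel by a few %

# Gravity leakage: a 1 degree tilt of the sensor bleeds g into the axis
tilt_error = g * math.sin(math.radians(1.0))

# Typical MEMS accel noise density (~300 ug/sqrt(Hz)) over a 100 Hz bandwidth
noise_rms = (300e-6 * g) * math.sqrt(100)

print(f"signal to detect : {sweep_effect:.4f} m/s^2")
print(f"1-deg tilt error : {tilt_error:.4f} m/s^2")
print(f"noise floor (RMS): {noise_rms:.4f} m/s^2")
```

Under these assumptions the tilt error alone is roughly forty times the sweeping signal, and the noise floor several times it, which matches the experience that the accelerometer simply couldn’t resolve the changes.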
The other approach [Chris] attempted was a curling broom with a handle that lit up differently based on how hard one was sweeping. It wasn’t hard to put an LED strip on a broom and light it up based on a load sensor reading; what ended up sinking the project was the need to do it in a way that didn’t interfere with the broom’s primary function. Even a mediocre curler applies extremely high forces to a broom when sweeping, so not only do the electronics need to be extremely rugged, but the broom’s shaft must withstand considerable force. The ideal shaft would be clear, hollow plastic holding an LED strip, with an attachment point for the load sensor, but no plastic was up to the task. [Chris] made an aluminum-reinforced shaft, but even that only barely worked.
We’re glad [Chris] shared his findings, and he said the project deserves a more detailed report. We’re looking forward to that, because failure is a great teacher, and we’ve celebrated its learning potential time and again.
This is a gem: “and it turns out that anyone, presented with a light-up broom that measures their strength, will instantly apply as much force as they can to test it”
Seems like using machine vision (a camera on a tall pole for field of view, and/or a fisheye lens with software correction) would be more usable for tracking the acceleration of the stone, combined with the accelerometer data?
And the broom: can’t even begin to think about how one might make that a reality. Perhaps put some FSRs or FSR-like ‘strands’ between the regular bristles, and see what kind of data you get when sweeping. Perhaps something else could be done with the bristles, but that would surely take a lot of calibrating to ever yield usable data, and might impede the handling of the broom too much as well, I guess…
I agree with Geert, adding cameras and computer vision might be the more promising approach.
You could use flex sensors to measure how much an off-the-shelf broom is bending? I don’t know what kind of sensitivity a flex sensor has, but at least you’d have a broom that still works as a broom.
Yes, a flex sensor (strain gauge) would be the right tool. Glue it onto a regular broom and the broomstick itself becomes the load-bearing bending beam.
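A back-of-the-envelope check suggests this works (all shaft dimensions and loads below are assumed round numbers, not measurements of any real broom). For a hollow tube loaded as a cantilever, the surface strain at the gauge is ε = M·c / (E·I):

```python
import math

# Illustrative, assumed values -- not measurements of a real broom
F = 100.0                  # sweeping force at the head [N]
L = 0.5                    # lever arm from head to gauge location [m]
E = 69e9                   # Young's modulus of aluminum [Pa]
d_o, d_i = 0.025, 0.022    # tube outer/inner diameter [m]

I = math.pi * (d_o**4 - d_i**4) / 64   # second moment of area of a tube
M = F * L                              # bending moment at the gauge
strain = M * (d_o / 2) / (E * I)       # surface strain, eps = M*c/(E*I)

gauge_factor = 2.0                     # typical foil strain gauge
dR_over_R = gauge_factor * strain      # fractional resistance change
print(f"strain = {strain*1e6:.0f} microstrain, dR/R = {dR_over_R:.2%}")
```

Around a thousand microstrain, i.e. a fractional resistance change of a couple tenths of a percent: easily read with a Wheatstone bridge and an instrumentation amp, and nothing gets added to the shaft but a glued-on gauge.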
A system has already been designed and widely deployed to do, among other things, what the OP wants. Television studios already present real-time augmented-reality infographics on the curling ice and show overlays with the positions of curling stones in the match area.
The dimensions of the curling area are specified by the governing bodies, tightly controlled at venues, and marked in high contrast on the ice. Set up cameras anywhere, all directed at the match area. Take a high-resolution still photo with no participants on the ice and the boundary markings in frame for a calibration reference. Apply a correction profile to eliminate any non-linear distortion (fish-eye) based on the static characteristics of the lens. Then interpolate the x-y match-play-area dimensions from the frame’s four corner reference points. Place a high-contrast feature on each curling stone that can be located in each frame’s 2D x-y pixel coordinate space. Once translated to 3D virtual space, its z offset on the real stone can be dropped to the floor of the virtual match-play area. The more cameras involved, the more accurate the composed position of each stone will be. And the processing/analysis can be done in post long after the match is concluded, unlike television systems that must do AR compositing in real time.
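The corner-interpolation step described above is a planar homography: four pixel/ice point pairs fully determine the mapping from camera pixels to sheet coordinates. A minimal NumPy sketch (all coordinate values here are made-up examples; a real setup would use the measured sheet dimensions and detected corner pixels, after distortion correction):

```python
import numpy as np

def fit_homography(px_pts, ice_pts):
    """Solve the 3x3 homography H (with h33 fixed to 1) from 4 point pairs (DLT)."""
    A, b = [], []
    for (x, y), (X, Y) in zip(px_pts, ice_pts):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y]); b.append(X)
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y]); b.append(Y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def pixel_to_ice(H, x, y):
    """Map a pixel coordinate to sheet coordinates (with perspective divide)."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

# Hypothetical calibration: pixel corners of a patch of sheet vs. real metres
px_corners  = [(212, 88), (1710, 95), (1850, 990), (95, 1002)]
ice_corners = [(0.0, 0.0), (4.75, 0.0), (4.75, 8.23), (0.0, 8.23)]
H = fit_homography(px_corners, ice_corners)
print(pixel_to_ice(H, 960, 540))   # some stone-marker pixel -> metres on the sheet
```

With more than four correspondences, or multiple cameras, you’d switch to a least-squares solve, but the principle is the same.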
TV studios have been doing this for all kinds of sporting events, dating back to the 1st & Ten graphics system in late-1990s American football.
Although I’m sure that works fine, that’s *slightly* more involved than strapping a sensor-laden teensy to a curling stone and then dumping the data via USB after the throw (my attempted solution).
Not related to curling, but related to tracking a thrown object:
I recently went to the 2019 World Championship Punkin Chunkin (they moved it to Illinois, so it was only a 30 min drive for me).
And the majority of the down time between shots is spent locating where the pumpkin landed to measure how far it traveled.
Spotters in golf carts drive out, look for the impact crater, GPS-locate and mark it, exit the firing range, and then the next pumpkin can be fired.
Might make a good Ask Hackaday article:
How do you make a GPS-enabled pumpkin that can survive being shot more than 4000 ft (1200 m), survive the impact with the ground, and not affect the structural integrity of the pumpkin?
My crappy cell phone video of the event:
https://youtu.be/JVLGfQutNxg
If you make it cheap, it doesn’t even need to survive: just three or four transmitted GPS fixes after launch would give you a pretty good estimate of its trajectory. The golf crowd has a lot of stuff they’ve done with optical tracking of drives, as well.
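Sketching that idea with made-up launch numbers (and, importantly, no air drag; real pumpkin flights at these speeds are drag-dominated, so treat this as the simplest possible estimator): fit a ballistic arc to a few early (time, downrange, altitude) fixes and extrapolate the landing point.

```python
import numpy as np

g = 9.81
# Three hypothetical GPS fixes after launch: (t [s], downrange x [m], altitude z [m])
fixes = [(1.0, 90.1, 58.2), (2.0, 180.2, 106.6), (3.0, 270.3, 145.2)]
t = np.array([f[0] for f in fixes])
x = np.array([f[1] for f in fixes])
z = np.array([f[2] for f in fixes])

# Drag-free model: x = vx*t,  z = vz*t - g*t^2/2  (launch at the origin)
vx = np.sum(x * t) / np.sum(t * t)                     # least-squares slope through origin
vz = np.sum((z + 0.5 * g * t**2) * t) / np.sum(t * t)  # same, after removing gravity

t_land = 2.0 * vz / g          # time to return to launch altitude
est_range = vx * t_land
print(f"estimated range: {est_range:.0f} m ({est_range * 3.281:.0f} ft)")
```

Even a cheap 1 Hz GPS gives you enough fixes for this; in practice you’d fit a drag term too, but even the crude estimate narrows the spotters’ search area.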
Isn’t the stone in ‘freefall’ once it is let go, and therefore there is no acceleration? And isn’t the ice sooooo slippery that the friction is so small there is effectively no deceleration?
I agree, an image based system is probably better.
There must be deceleration on the ice, otherwise it would ALWAYS hit the barrier at the border of the playing field. There is also sideways acceleration when it changes its direction. Of course it gets complicated when the stone rotates.
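Right: ‘slippery’ just means the friction coefficient is small, not zero. With a friction coefficient around 0.01 (a commonly quoted ballpark for a running stone on pebbled ice; the real value varies with ice temperature and sweeping), the deceleration and stopping distance come out at curling-scale numbers:

```python
g = 9.81
mu = 0.01            # assumed ballpark kinetic friction, stone on pebbled ice
v0 = 2.0             # typical release speed for a draw [m/s]

decel = mu * g                    # a = mu * g  ~ 0.1 m/s^2
stop_dist = v0**2 / (2 * decel)   # d = v^2 / (2a)
print(f"decel = {decel:.3f} m/s^2, stopping distance = {stop_dist:.1f} m")
```

A glide of a couple dozen metres from a 2 m/s release is the right order of magnitude for a real draw, so a small but very real deceleration is exactly what makes the game work. It’s also why the accelerometer signal in [Chris]’s logger was so tiny.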
This company seems to be able to monitor curling rocks successfully… https://www.klutchcurling.ca/download
There is a much more convenient solution coming onto the market soon, with no contact with the stone at all.
http://Www.angularvelocity.fi is worth tracking.
How about something like an optical mouse sensor? Someone should see if an optical mouse can track on ice. If yes, find a way to mount a sensor on the stone. If not, perhaps a similar system could be made which looks *up* at the ceiling to sense motion.
I agree with the other posters: an IMU isn’t going to work in this application.
A camera on a pole above the play area will be a more promising approach. If you include a thermal image sensor (expensive, I know) you could also measure the effect of the brushing on the ice. This would give you an “all-in-one” solution without having to modify any of the playing equipment.
I wonder if this might be a good application to use/abuse the Lighthouse VR tracker system? The small hand trackers that were posted a while back here would probably easily fit the stones, and I think the Lighthouse system should have the coverage and accuracy to get very nice position data on the stones. The only problem might be reflections of the lasers on the ice. (I know, not exactly a cheap solution if you don’t already have the Lighthouse beacons to begin with.)