The Thalmic Myo is an electronic arm band with an IMU and myoelectric sensors, able to measure the orientation and muscle movements of an arm. This device has uses ranging from prosthetics to Minority Report-style user interfaces. Thalmic is also a Y Combinator company, with $15 million in funding and tech press gushing over the possible uses of this futuristic device. Truly, a remarkable story for the future of user interfaces and pseudo-medical devices that can get around most FDA regulations.
A few months ago, Thalmic released a firmware update to the Myo that blocks raw access to the myoelectric sensors. Anyone wanting to develop for the Myo now needs to submit an application and pay Thalmic and their investors a pound of flesh – up to $5000 for academic institutions. The current version of the firmware only provides access to IMU data and ‘gestures’ – not the raw muscle data that would be invaluable when researching RSI detection, amputee prosthetics, or a hundred other ideas floating around the Thalmic forums.
Thalmic started their company with the idea that an open SDK would be best for the community, with access to the raw sensor data available in all but the latest version of the firmware. A few firmware revisions ago, Thalmic removed access to this raw data, breaking a number of open source projects used by researchers and anyone else experimenting with the Thalmic Myo. Luckily, someone smart enough to look at version numbers has come up with an open library to read the raw sensor data. It works well, even though the official position of Thalmic is that raw sensor data will be unavailable in the future. If you want to develop something with the Myo, this library just saved your butt.
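For anyone curious what using one of these open libraries looks like, here is a rough sketch of reading the raw EMG stream on an older firmware. The module and handler names (myo_raw, MyoRaw, add_emg_handler, run) are assumptions based on how dzhu's myo-raw project is usually described, so check the project's README before leaning on them.

```python
# Hedged sketch: reading raw EMG from the Myo with an open-source library.
# The class and method names below are assumptions, not a documented API;
# consult the library's README for the real interface.
import sys
from myo_raw import MyoRaw  # assumed module name

def on_emg(emg, moving):
    # emg is expected to be a tuple of 8 integer sensor readings
    print(emg)

m = MyoRaw(sys.argv[1] if len(sys.argv) > 1 else None)  # serial port of the BLED112 dongle
m.add_emg_handler(on_emg)
m.connect()

try:
    while True:
        m.run(1)  # pump BLE events for roughly one second at a time
finally:
    m.disconnect()
```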
Thalmic will have an official statement on access to raw sensor data soon.
Quick aside, but if you want to see how nearly every form of media is crooked, try submitting this to Hacker News and look at the Thalmic investors. Edit: don’t bother, we’re blacklisted or something.
Update: Thalmic has updated their policy, and will be releasing a firmware version that gives access to the raw EMG sensor data later on. Their reasoning for getting rid of the raw sensor data is twofold:
- Battery life. Streaming raw data out of the armband takes a lot of power. Apparently figuring out ‘gestures’ on the uC and sending those saves power.
- User experience. EMG data differs from person to person and is hard to interpret.
According to the linked thread, they locked down the firmware “to improve battery life”. I’d love to know how that works.
“Quick aside, but if you want to see how nearly every form of media is crooked, try submitting this to Hacker News and look at the Thalmic investors.”
That’s a serious accusation. Do you have any evidence to back it up?
I was the one who submitted this article here. I also tried to submit this to HN. When I did so, the article was marked as [DEAD]. Usually, there are domains that are banned to prevent the SNR from going through the toilet. I get that.
Then I looked at Y Combinator's funded list: Thalmic Labs is on there. I unfortunately cannot find any PG posts discussing his stance against slandering/libeling YC-funded companies, but he has said as much before, when Scribd was attacked for profiteering from the piracy of books.
Also, I have an idea regarding the firmware. They use a Cortex M4 processor inside the armband. If my back-of-napkin numbers are right, they could move gesture detection inside the armband. That way, there'd be no raw data anywhere, and they could control dev units with a 'dev only' firmware that adds raw access back in. I believe the iPhone/Android targets already run the gesture-only firmware. This is a guess, because I will not 'downgrade' my firmware to the next version to check.
So, they did the wrong thing. I've pretty much washed my hands of them. Instead, I'm looking at building my own. I don't think it'll be expensive to build; maybe $20 for the myoelectric sensors (I'll make my own using op-amps and AgCl anodes).
Why not make it a project here?
A feature that could make this more useful would be the ability to move the sensors closer together or farther apart, plus more sensors and smaller sensors.
Fiberoptic output of data?
Allowing access to the hardware only via the SDK means they'll be able to control polling intervals and potentially remove expensive code. Sure, it's not right, and sure, on some occasions battery life will not improve, but the rationale is understandable, if flawed.
If you want to keep the code on the Cortex M4 cheap, do the following:
- Export the 9-DoF IMU as 9 doubles.
- Export the 8 EMG sensors as 8 ints.
Now plot this with regard to accuracy and polling rate. Adjust the polling rate incrementally, find the sweet spot, turn it down a nudge, and go with that (see the sketch below).
You likely won't need more than a 20 MHz Atmel for this. Every computer (cell phone, laptop, desktop) is more powerful than an M4, so why process the data on it? Locking users out of raw data is the reason.
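To make the "find the sweet spot" step concrete, here is roughly what that sweep could look like in Python: generate (or resample) labelled EMG windows at a few candidate polling rates and plot cross-validated gesture accuracy against rate. Everything here is illustrative – the synthetic data, window length, and classifier are placeholders, not anything Thalmic actually ships.

```python
# Hypothetical sketch of the polling-rate sweep described above.
# The recording and labels are synthetic placeholders; substitute a real
# capture of the 8 EMG channels (ints) if you have one.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
FULL_RATE_HZ = 200                      # assumed native EMG sample rate
N_GESTURES, WINDOWS_PER_GESTURE = 4, 50
WINDOW_S = 0.25                         # 250 ms analysis window

def fake_window(gesture, rate_hz):
    """One synthetic 8-channel EMG window; amplitude differs per gesture."""
    n = int(WINDOW_S * rate_hz)
    return rng.normal(0, 1 + gesture, size=(n, 8))

def mav_features(window):
    """Mean absolute value per channel."""
    return np.mean(np.abs(window), axis=0)

rates = [25, 50, 100, FULL_RATE_HZ]
accuracies = []
for rate in rates:
    X, y = [], []
    for g in range(N_GESTURES):
        for _ in range(WINDOWS_PER_GESTURE):
            X.append(mav_features(fake_window(g, rate)))
            y.append(g)
    acc = cross_val_score(SVC(), np.array(X), np.array(y), cv=5).mean()
    accuracies.append(acc)

plt.plot(rates, accuracies, marker="o")
plt.xlabel("EMG polling rate (Hz)")
plt.ylabel("cross-validated gesture accuracy")
plt.title("Accuracy vs. polling rate (synthetic data)")
plt.show()
```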
If they had provided the developer with a recommended set of API calls, they could still impose control on how often the hardware is polled. I don’t know how it follows that ‘We need more control over the interface, therefore the developers need to give us money’.
Do these people not realise that when they pull stunts like this, they make the product substantially less attractive?
That’s precisely my point.
They made a point that raw data had an implementation cost. And in order to make raw data more palatable, they would have to make a special implementation.
In reality, you provide the sensor number and the data. Surprise: it's already there. Now, the analysis tools are not in their SDK; that's because they use a refined dataset that generally applies to all users. For training data, that's fine.
Now, give me the choice between API calls through their software and direct raw access. That would have been bloody simple, and it wouldn't have cost anything in terms of man-hours. It's already done!
It's too bad. I had some awesome applications in mind for the raw data. For example, using two of these would allow you to analyse repetitive strain on the job: we can see which muscle groups are triggered, and the more often and the longer they're triggered, the higher the RSI risk. We could pay $300 for two and analyse this whole area (see the sketch below).
Or another purpose: amputees. They may have no hand, or only part of an arm. With raw data and training, an amputee could use this for computer input, going to a tablet/phone/laptop or another device, like a robotic prosthetic. Hmm… Thalmic could have been the de facto electronics for open prosthetics. But instead they diddle with pre-defined gestures.
Or how about this purpose: We know the kinematics of the arm. And we know where the muscles should be. So, we can put a MYO on each arm, and calibrate the person with predefined poses to show muscular activation. Now, once this is done, you go exercise. The MYOs can then determine which muscle groups were appropriately exercised in the arms, and which were not. And if they made an ankleMYO, do the same for the legs…. Now it’s an intimate physical trainer.
But they have scrapped this market by making it applicable to games and PowerPoint. Pity.
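The RSI idea boils down to a simple computation on the raw stream: rectify each EMG channel, smooth it into an envelope, and add up how long each channel sits above an activation threshold over a shift. A rough sketch, where the sample rate, threshold, and recording are all placeholder assumptions:

```python
# Rough sketch of the RSI idea: time-above-threshold per muscle channel.
# Sample rate, threshold, and the synthetic recording are placeholder assumptions.
import numpy as np

FS_HZ = 200                      # assumed EMG sample rate
THRESHOLD = 0.5                  # activation threshold on the smoothed envelope
SMOOTH_S = 0.1                   # 100 ms moving-average envelope

def envelope(emg):
    """Rectify and smooth one channel of raw EMG."""
    n = int(SMOOTH_S * FS_HZ)
    kernel = np.ones(n) / n
    return np.convolve(np.abs(emg), kernel, mode="same")

def activation_seconds(recording):
    """recording: (n_samples, 8) raw EMG. Returns seconds above threshold per channel."""
    env = np.apply_along_axis(envelope, 0, recording)
    return (env > THRESHOLD).sum(axis=0) / FS_HZ

# Placeholder: an hour of fake 8-channel EMG instead of a real work-shift capture.
fake_shift = np.random.default_rng(1).normal(0, 0.6, size=(FS_HZ * 3600, 8))
for ch, secs in enumerate(activation_seconds(fake_shift)):
    print(f"channel {ch}: active {secs:.0f} s this hour")
```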
They SOLD the investment idea as having access to RAW data.
By fucking removing it they’re taking away the primary feature of having one in the first place.
I will no longer be buying any. fuck these people.
> That’s a serious accusation. Do you have any evidence to back it up?
Do I have evidence that all media is crooked? You mean evidence that the Hong Kong bureau of the Wall Street Journal is beholden to the Chinese government, re: the student protests a few months ago? Yeah, I have evidence of that, because it's freely available on the Internet.
That TechCrunch is at least partially responsible for the head of Mozilla’s dismissal because he donated to California’s Proposition 8 several years ago, while editors of TechCrunch donated against it and didn’t disclose it? Yeah, I have evidence of that, although you’ll need to look up the publicly available records for yourself.
Or do you mean evidence in this specific case, where HN may, in the future, manipulate stories that put Thalmic in a disparaging light? No, I don’t have evidence of a future event.
I would invite anyone to create an HN account (requires no email or anything).
Then submit a link that goes to the main page of Thalmic Labs (it dies on post).
Then submit a link that goes to the developer forum, posting to HN what is happening there (it dies again on post).
Then I posted the reddit links I made outlining this situation. They died too, but I think HN has blocks against reddit cross-linking.
I’ve also heard PG talk about killing bad PR articles of companies he funds. Makes a lot of sense. HN is his news/PR arm. For most news, it’s pretty reliable. Just don’t touch his babies.
Just tested. Anything mentioning the keyword “Thalmic” seems to be filtered out.
This is why you get your news from multiple sources. I would not call the "censorship" crooked, just a product of the selective bias of human nature.
HaD prides itself on Kickstarter hatchet jobs, except when the project has an -uino suffix on it.
Case in point: Espruino "support" for UART over audio. It clearly does not run as you would expect (a 2 s delay, not sound in theory nor proven with loopback tests), but that doesn't stop Brian from fawning over it.
I don’t call Brian crooked though, just understandably biased.
And that’s a comment from “NotArduino”.
But thanks.
Every single person you meet is biased – heck, if you really hate bias then you're biased against people with blatant bias!
Bit of a catch-22 really.
There's a difference though between favouring things when talking about them and actively restricting other people's discussions of them (assuming the description is accurate).
https://news.ycombinator.com/item?id=8624075
Cue the comments that say 'dickheads'. 'Cause everyone has an opinion; not everyone has the hardware.
Well, I have the hardware, so I can safely say: "dickheads".
They just killed probably the most useful function of the Myo.
The gesture recognition is actually very bad in real use with no signs of any major improvement anytime soon. Even when it works the gestures are a bit pointless and the whole thing feels too forced to be practical. Might as well just use accelerometers or IR systems like the Kinect.
The Myo is now dead in the water and I’ll be selling mine before more people start to realise this.
Be aware, you can downgrade your firmware. Uninstall the SDK (leave the BlueGiga driver), and then use the manual firmware procedure.
Once you do this, you can start working on real programming, along with machine learning and such. Just treat it as a dumb sensor and handle your processing on the computer.
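Concretely, "handle your processing on the computer" can be as simple as windowing the raw stream, taking a mean-absolute-value feature per channel, and feeding that to an off-the-shelf classifier. A minimal sketch, with placeholder arrays standing in for whatever you record through the downgraded firmware (this is not Thalmic's pipeline, just the obvious DIY version):

```python
# Minimal sketch of host-side processing on raw EMG windows.
# The training data is a synthetic placeholder; record your own windows
# with the old firmware and substitute them here.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def mav(window):
    """Mean absolute value per channel for one (n_samples, 8) raw EMG window."""
    return np.mean(np.abs(window), axis=0)

# Placeholder training data: 200 labelled windows of 50 samples x 8 channels.
rng = np.random.default_rng(2)
labels = rng.integers(0, 3, size=200)                    # pretend gesture ids 0..2
windows = rng.normal(0, 1, size=(200, 50, 8)) + labels[:, None, None]

X = np.array([mav(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)

new_window = rng.normal(0, 1, size=(50, 8)) + 2          # a fresh "gesture 2" window
print("predicted gesture id:", clf.predict([mav(new_window)])[0])
```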
And I'll be starting my own myoelectric band, sans accelerometer/CPU. I'm looking at bare bones, and cheap too. This tech is just too interesting to pass up.
Maybe something that could replace the strap on a watch? Those are somewhat standardised.
I had a few ideas, but velcro (or similar) seems simple enough. I’d also be looking at a heck of a lot more sensors too.
The sensors themselves you want made out of silver chloride. You can make these by simply submerging the silver anodes/cathodes in chlorine bleach and leaving them for a day, or you can electroplate them, which is quicker. Note: you want the anode, ground, and cathode all in AgCl; it has the lowest interface resistance and capacitance.
Next, the ADC. For example: http://www.adafruit.com/product/1083 . But obviously bare SMT is the way to go, rather than the nicely made prefab board. Buy however many you need for all your sensors. It uses I2C, and you can put up to 4 of these boards on one I2C bus (for a total of 16 inputs). A quick bench-test sketch follows this list.
Third: a micro. I'd recommend an Atmel. They work, and there's a nice dev environment with the Arduino bootloader firmware. I think the whole line has I2C.
Fourth: a low-power BT chip. I like the look of this: http://www.nordicsemi.com/eng/Products/Bluetooth-R-low-energy/nRF8001 . Does anyone else have any ideas?
Fifth: batteries. I like the 18650s you get off Aliexpress. 3.7 V, lots of mAh, rechargeable, and cheap. And since they're lithium-ion, they're also light.
But that's just my rough draft. I'd be happy to work with anyone on this.
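Before any Atmel firmware gets written, it's handy to bench-test the ADC from any Linux board with an I2C header (a Raspberry Pi, say). Here's a hedged sketch using smbus2 and the ADS1x15 register map; the bus number and address are the usual defaults but still assumptions, and if the breakout is the 12-bit ADS1015 rather than the 16-bit ADS1115, shift the result right by 4 bits.

```python
# Hedged bench-test sketch: single-shot read of channel AIN0 on an ADS1x15
# breakout over I2C from a Linux SBC. Bus number (1) and address (0x48) are
# the common defaults, but check your wiring and the ADDR pin.
import time
from smbus2 import SMBus

ADDR = 0x48            # default ADS1x15 I2C address
REG_CONVERSION = 0x00
REG_CONFIG = 0x01
# 0xC383: start single-shot, AIN0 vs GND, +/-4.096 V range, default data rate,
# comparator disabled.
CONFIG = [0xC3, 0x83]

with SMBus(1) as bus:
    bus.write_i2c_block_data(ADDR, REG_CONFIG, CONFIG)
    time.sleep(0.01)                               # wait out the conversion
    hi, lo = bus.read_i2c_block_data(ADDR, REG_CONVERSION, 2)
    raw = (hi << 8) | lo
    if raw & 0x8000:                               # sign-extend the 16-bit result
        raw -= 1 << 16
    volts = raw * 4.096 / 32768                    # LSB size for the +/-4.096 V range
    print(f"AIN0: {volts:.4f} V")
```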
I’m in the market for silver chloride… oddly enough.
Are you sure about the large battery? It seems cumbersome to me.
@ganzuul regarding large battery…
Well, I just made an Aliexpress purchase of ten 18650s along with battery holders. No, I'm not entirely sure how bulky they'll feel, or if they're the ideal size for this application. However, I was thinking that since they'd be on the arm, the extra bulk may not be a big deal.
Of course, this would just be a reason to play with my new batteries and equipment! And testing and experimentation can look like crap the first time around. Polish can be applied later :)
Only ‘dickheads’ in the sense that there are lots of people who bought this (including me) with the expectation they would have access to all the data.
I realize they are a business, but I don't like bait-and-switch. Assuming they don't undo this change in a future version, I'm OK with it as long as they don't prevent downgrading the firmware if one wishes.
Do you remember PS3 Linux?
Yes, this debacle is PS3 Linux all over again. Upgrade is actually downgrade with increased version numbers.
And that’s why this NEEDS to be put into the community. We either change corporate stance OR we warn others to NOT BUY.
That was actually the first example I thought of.
I was irate when they killed the ability to boot Linux on the PS3 – the PS3 was actually my first experience with open source, and I ported Caml and Coq to it (then someone from INRIA came along and did it right, but I'm pretty sure I was first, ha).
This is Cybiko SDK all over again …
They clearly stated that raw EMG access would not be available, in multiple forum posts on their site.
Well, except when they offered access to multiple developers for $100, $200, or $5000. Then it's possible.
Except we proved that the firmware was already giving raw data. Until this week’s update.
And now, with on-wrist gesture computation, the Myo's gesture detection is worse!
Good Job, Thalmic.
An NDA from a DARPA grant??? Never…
Is this a recent release of the firmware? I have one of these I've yet to take out of the box. Assuming the device doesn't already have the .6 firmware on it, I guarantee I won't be upgrading.
Yes, it is a recent firmware update. Right now, I'm getting no reports of firmware downgrades being blocked. So, uninstall the SDK but keep the BlueGiga driver (if you're on Windows), and then use the manual firmware updater built into the device.
Last known good firmware is 0.8.18.2
As for the firmware, get it below. Someone already made a MEGA link.
So, get it here: https://mega.co.nz/#!ycVkHbDQ!_bGo7yn6T0Io3qd0RrsAw4MyEQkonjElljBHBdorTHk
So is backing up the currently loaded firmware just as easy as the manual upgrade (access the internal drive and copy the existing hex file to the local computer)?
I'm not entirely sure if you can back up the current firmware that way. I don't have it with me at the moment.
But I know from other users that firmware updating from the BAD firmware to known good (0.8.18.2) does indeed work… For now.
They could easily do a firmware version check. They're running a Cortex M4, and a version check is trivial. But that's all they'd need to lock out people who are afraid of editing Motorola S-record firmware to manipulate the version.
The Leap has a big problem, which is developers barely producing stuff for it – and that's with free access. You won't see any research done on these things with the $5000 price tag, as any third-year EE could probably build a device like this. Also, having a wireless system would definitely interfere with the data, so the only use for this system would be FDA-approved medical equipment.
Maybe it's a good opportunity to test the alternatives, such as Somaxis ;)
Unfortunately, I'm only seeing paper-constructed blocky trees on a sphere. Looking further into their website shows little content. (Is this the right site: http://www.somaxis.com/ ?)
Would you mind telling us more about Somaxis?
I like the minimal approach of the website.
Anyway:
http://www.qsgear.com/en/somaxis-myolink
I'd like to start by referencing the first line of a post from yesterday: "The TP-Link TL-WR703n is the WRT54G for the modern era – extremely hackable, cheap, and available just about everywhere."
As a product designer, I can understand the necessary evil of things like closing environments and needing patents. But when it comes to something that can be made better by the community, whether you (as the company) can think of how or not, I think it's best to open it up and let people hack it. Let people see how it works and use it to make it better. How many people are going to buy that WR703n BECAUSE of its hackability, as opposed to all the other closed systems? Especially when it comes to sensors that can be used in things like prosthetics: imagine being the company at the center of a movement that takes your product, coupled with a 3D printer, and makes better working hands for children (as an example). It hurts everyone when things are closed off, because then the company just gets pissed when people do in fact step on their toes and hack it anyway.
I'd really like to see a movement of companies making things more open. Yesterday's post was a good sign of things to come; this one, not so much.
They always said they were unsure if raw data was going to be accessible. Constantly streaming the data drops usable battery life. It was a common question in the forum, and it was never answered as though they were going to allow access in the long run. Not sure why everyone is acting surprised about the raw EMG data. Also, I own a Myo armband.
AFAIK the band always sends the data anyway, since the gestures are interpreted in the driver running on the host machine.
Also, if the battery was the problem, why is it suddenly OK once you've paid the $5000 fee?
gkmac said:
“AFAIK the band always sends the data anyway, since the gestures are interpreted in the driver running on the host machine.”
Not anymore. The MYO has a Cortex M4. The most recent driver “update” cripples the device by moving gesture detection to the Cortex M4 directly. It costs more power, gives worse performance, AND removes raw access. Why? So they can sell it back to you for $$$
1. No official Linux support.
2. It's Bluetooth Low Energy for a reason: Cortex M4 processing costs more than the BLE transmit.
3. The firmware "upgrade" is a downgrade with respect to access and quality. Reports are coming in that gestures with the new firmware are worse.
4. Raw access was pulled because they want to charge $100 / $200 / $5000 (academic) for raw access. This is ON TOP of $150 for the bloody device. Multiple reports in the forum corroborate this, and I received the $5000 email when asking for raw support.
5. The device was initially publicized as open-source friendly, with raw access for the community. They have backed out of that understanding.
They made this decision for profiteering reasons. It _might_ have to do with the battery. If that's the case, then allow easy switching between the processed-onboard and raw-access firmware. If you want something awesome, provide an SDK for programming the armband directly.
Someone can do it better. And we in the Linux community, thanks to Dzhu, have done better: with only 10 minutes of training, I had 10 unique gestures. But you *might* get a finger snap if everyone gives enough gesture data to Thalmic. Boo.
UPDATE:
https://www.thalmic.com/blog/big-data/
RAW EMG DATA OPEN TO DEVELOPERS IN DECEMBER
I just got the email! Well, things are looking up quite a bit then!
@Brian Benchoff Can you please include this update? I do believe it’s sincere. We need to inform the users of this most delightful change!
Done
Thank you :)
If they’re going to do the right thing, I want to praise them. Well, that and buy their stuff and work on cool things.
Here’s the release: https://www.thalmic.com/blog/big-data/
Also, according to Alex Kinsella (Thalmic’s product manager), there won’t be a surcharge for data access: https://twitter.com/alexkinsella/status/534745912191750144
This is probably not the right place to mention/bring this up, but how is the “Quick aside” section of this article adhering to the principles of HaD articles/posts?
“Our playful posts are the gold-standard in entertainment for engineers and engineering enthusiasts.”
They just figured they screwed themselves over.
Why block access on a V1 device when you can get the community to do all the hard research, ready for a V2 device?
Ok let’s open her up again……
I'm all for making scikit-learn implementations even better! Machine learning techniques are awesome.
And frankly, if Thalmic puts in the time for a myoelectric corpus, I'd pay to use a refined one. Corpus dictionaries are bloody expensive, and for good reason! It takes thousands of hours of samples to generate even a mediocre corpus. In Google's case, they used 7-ish million hours of voice to do voice-to-text.
What I implemented is the short, one-person, individual corpus, trained at only one position on the arm. It would require training until enough data was gathered for a person's arm.
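Collecting that individual corpus is mostly bookkeeping: prompt for a gesture, grab a batch of labelled raw windows, and append them to a file you can retrain from later. A sketch of the recording side, where get_emg_window() is a hypothetical stand-in for however you pull raw samples (e.g. the open library above):

```python
# Sketch of per-user corpus collection. get_emg_window() is a hypothetical
# placeholder for your raw-data source; here it just returns random samples.
import numpy as np

GESTURES = ["rest", "fist", "wave_in", "wave_out"]
REPS_PER_GESTURE = 20
WINDOW_SAMPLES = 50                      # samples per labelled window, 8 channels

def get_emg_window():
    """Stand-in for reading WINDOW_SAMPLES x 8 raw EMG samples from the armband."""
    return np.random.default_rng().normal(0, 1, size=(WINDOW_SAMPLES, 8))

windows, labels = [], []
for gesture_id, name in enumerate(GESTURES):
    input(f"Hold '{name}' and press Enter...")
    for _ in range(REPS_PER_GESTURE):
        windows.append(get_emg_window())
        labels.append(gesture_id)

# One .npz per user/arm position; retrain whenever more data is added.
np.savez("corpus_user1_forearm.npz", windows=np.array(windows), labels=np.array(labels))
print(f"saved {len(labels)} labelled windows")
```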
Not to mention the free publicity they get and they come out the other side looking like the good guys in the eyes of some.
I, for one, am seeing through this charade. It's greed all the way.
Not really.
They were A/B testing the users on their forum. They certainly were not planning on free raw access. They were targeting either $100 or $200.
After I worked with Dzhu's project and made it viable, I submitted all of this to HaD.
And.. here we are.
I see great things coming out of this device, and I think they were being incredibly shortsighted in nickel-and-diming the developers. Some of the applications I foresee will require two bracelets; in fact, most will. They also have a well-noted upgrade path (more electrodes, versions that fit legs and the torso, a control box for all Myo devices).
Emotiv did the same thing with the EPOC. I still don't have one because of that.
http://en.wikipedia.org/wiki/Talk:Emotiv_Systems#Removal_of_URL_and_reference_to_illegal_usage
Astroturfing, an old time-honored tradition of Wikipedia users (read: editors). :D
This looks a lot like the device Meta/Facebook touted as its future for reading muscular input for alternative VR hand tracking.