Imagine how hard it could be to add a touch screen to a Mac laptop. You’re thinking expensive and difficult, right? How could [Anish] and his friends possibly manage to upgrade their Mac with a touchscreen for only a dollar? That just doesn’t seem possible.
The trick, of course, is software. A small mirror mounted over the machine’s webcam with stiff card, hot glue, and a door hinge gives the camera a view of the screen. By watching that view and deciding whether the image of a finger is touching its on-screen reflection, a remarkably simple touch screen can be created, and the promise of a one-dollar price becomes a reality. We have to salute them for coming up with such an elegant solution.
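For the curious, the core trick reduces to one geometric test: when the camera sees the finger blob and its mirror reflection merge at the screen surface, that counts as a touch. Here is a minimal sketch of that idea in Python/NumPy — hypothetical helper names, not the project’s actual code, and assuming a binary mask has already been extracted from each webcam frame:

```python
import numpy as np

def reflection_gap(mask: np.ndarray) -> int:
    """Count the empty rows between the finger blob and its mirror
    reflection in a binary mask of the webcam's view near the screen.
    Returns -1 if nothing is detected."""
    filled = mask.any(axis=1)          # which rows contain any blob pixels
    rows = np.flatnonzero(filled)
    if rows.size == 0:
        return -1
    # Empty rows inside the span covered by the two blobs form the gap.
    span = filled[rows[0]:rows[-1] + 1]
    return int(np.count_nonzero(~span))

def is_touch(mask: np.ndarray, tol: int = 2) -> bool:
    """Finger and reflection merging (gap near zero) is read as a touch."""
    gap = reflection_gap(mask)
    return 0 <= gap <= tol
```

In practice the mask would come from per-frame background subtraction or skin-tone thresholding, but the touch decision itself really is this simple: two blobs meeting means the fingertip has reached the glass.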
They have a video which we’ve put below the break, showing a few simple applications for their interface. Certainly a lot less bother than a more traditional conversion.
[wpvideo fnmWafzC]
i’d buy that for a dollar
https://www.youtube.com/watch?v=pBniPaKP3mc
Most awesome hack of 2018 so far… Wow, people…
You get a new feature and solve a potential privacy issue at the same time.
I don’t know why Apple refuses to make a touchscreen Mac laptop or a two-in-one. You have these options to choose from on the PC side. With Apple, your options are limited to what they offer.
Apple doesn’t make a touchscreen laptop because it would be a terrible user experience. Two words are sufficient: “gorilla arm.”
Another two words: “finger print”
I am used to killing everyone who dares to touch my screen with their fingerprint-printing fingers. :P
Well, that’s really one word…
But people accept fingerprints all over their phones and tablets (you just clean them periodically), so that’s not nearly as fatal an issue as the aforementioned gorilla arm syndrome.
His Steveness talked about this many times. Apple has done a ton of research (repeating research done many times before dating back to the early 80s), and vertical touch screens, with a very few exceptions (basically kiosks where the maximum user interaction is measured in seconds or a handful of minutes), make baby Jesus cry.
“Gorilla arm” is the result of using a vertical touchscreen for too long. Users usually hold a phone or tablet horizontally, not vertically.
Three words: don’t buy Apple.
Much like George’s book, Apple’s 1984 advert wasn’t supposed to be the game plan.
>“gorilla arm.”
No clue what that is. I’ve used touchscreens for years and never had an issue.
Everybody has used *horizontal* touchscreens for years – it’s how every smart phone and tablet works. But consider how those are used. Most people hold them at quite low angles relative to their eyes and more often than not are touching the body of the device with some part of the same hand that isn’t doing the pointing.
That stands in sharp contrast with the expectations of a touch-screen laptop. They’re used at quite high angles relative to the eyes and you’re expected to be able to make very, very fine motor movements to direct the pointing finger mainly with the muscles between your elbow and shoulder. Even typing that description in makes me feel dumb.
When I worked on Chrome team, we got first-edition Pixels. I used it as a plain-old laptop for like a year before using any of the touchscreen functionality. Within a year of starting to use the touchscreen, I was surprised to realize how often I was touching my MacBook Pro’s screen, and even sometimes my desktop screen, before sheepishly realizing that it wasn’t going to work. When I left Google, I got a touchscreen Chromebook for myself (an Asus, Pixels are too spendy for me!), because I really do value it. I’m still a bit bemused by the entire thing, because I was very anti-touchscreen.
It’s honestly not a problem. I find that there are touchscreen things, and there are trackpad things. Touchscreens suck for positioning your cursor, for instance. But they’re quite nice for moving a map around, or scrolling, and _sometimes_ for pushing buttons. I’d say I use the touch screen maybe 1/4 as often as the trackpad, and I’d be willing to pay an extra $100 or so for the feature.
Now, the Pixel C, an Android tablet with a keyboard add-on, pissed me right off. I tried using it for about 15 minutes before I returned it to facilities, it frustrated me so much. And my experience with my wife’s iPad-with-Bluetooth clamshell wasn’t much better. Those rely too much on touch for my tastes, and the keyboard just made me keenly aware of how poorly my expectations were being met. My expectations run towards technical work; I could maybe see them working well for writing up blog posts or emails.
Same reason why ALL ‘air’ or ‘free’ or ‘3d’ input devices the user has to hold up in midair have been commercial flops. It’s quickly tiring to hold your arm up and waggle it around for a while. For dedicated uses like manipulating a 3D model it’d work because it’s a singular purpose use, like sculpting some clay.
A 3D or vertical 2D input device is just not practical for a 2D interface for full time use, almost nobody has any problem translating horizontal 2D input device motion to a vertical 2D UI.
It’s obvious you’ve never used a touchscreen laptop for any significant amount of time. “Gorilla arm” as an argument against is a fallacy. On a laptop, you wouldn’t use touch exclusively; you use it in conjunction with the keyboard and mouse/touchpad. A touchscreen is perfect for direct gross-movement scrolling, zooming, dragging, and selecting, and it’s much quicker and more productive to do so. So much so that it’s annoying to switch back to a laptop without one.
That’s what trackpad gestures are for. It’s far more efficient to move your hand slightly down from the keyboard to manipulate the trackpad than to move it upwards not only easily twice as far but in a different plane to mess with the screen.
Trackpad gestures correspond only indirectly with the action on screen: the user enters input on a different plane, in a position away from the screen, which imposes an additional cognitive and physical burden. Touchscreen gestures directly correspond to on-screen actions and are executed more quickly and precisely. Trackpad gestures only imitate direct screen manipulation – you must first move your hand from the thing you are trying to manipulate to the location of the trackpad, then perform a gesture at non-1:1 scale while staying within the trackpad’s limits. Reaching up to manipulate the screen is more natural, quicker, and easier than moving away from it to manipulate the touchpad.
Because they can’t design a screen coating that doesn’t disintegrate without being touched.
My 2012 MBP is now on its 4th screen (at least Apple recognised they had an issue). The original and 1st replacement both started to lose their coating after about 1.5 years, the 2nd replacement had a bright spot so that went straight back. The 3rd replacement’s been going 6 months so far…
Quite good actually, for a 5.5-year-old laptop: the screen is 6 months old, the battery/keyboard/touchpad/upper-case assembly is 1 year old. It’s just the bottom that’s falling apart – the feet aren’t glued on but somehow plastic-welded to the bottom panel, which doesn’t last forever.
The problem there might actually be the user.
Yes, and Apple did the recall out of the goodness of their heart, not because it was another manufacturing defect…
https://www.macrumors.com/2017/02/24/apple-extended-anti-reflective-repair-program/
Sorry, but as the user I keep mine and everybody else’s fingers well away from the screen. The sleek design of the machine means that the screen comes into very slight contact with the keys when closed, such that you see outlines of the keys on the screen over time. These are just grease marks from the keys, which can be wiped off. The disappearing coating in no way had a pattern matching the keys when it began, although over time the key marks did contribute.
Oddly, I’d never come across this issue in the wild, despite asking my boss whether anybody else had reported it (possibly others don’t care – the number of other MBPs I see here at work that are absolutely covered in fingerprints means they probably wouldn’t notice/care if it did happen). Last week, though, I was at a meeting and just happened to be sat next to someone, and when I glanced at their screen I instantly saw they had this issue. That’s the only other time I’ve seen it…
As for the replacement keyboard etc, I was hinting that when it’s time to replace the battery after 4 years, you get a ‘free’ keyboard/touchpad/upper chassis thrown in since the battery is well glued in to the upper chassis.
Ha, it was a pain to find it again, but this is the commercial version:
https://www.indiegogo.com/projects/cozytap-next-generation-computer-interface#/
But at $39 to $49 it’s ridiculously expensive for what is essentially a mirror in some plastic.
But you’re paying $1 for the injection-moulded mirror and the rest for the software, which I note isn’t included (at the time of commenting) at the link above.
The code was published on GitHub: https://github.com/bijection/sistine – it’s been around a couple of years, it seems.
wow. I did not know this!
Can you give a breakdown of the costs involved and risks taken? I think many people seriously underestimate what goes into releasing a proper product. A product that actually works in various conditions, not just a pet project with all kinds of quirks people just have to deal with.
If there’s any doubt, I think I just have to point at Kickstarter to see how hard and costly it is to actually get something decent going.
“A mouse is far” – like he doesn’t have a trackpad right under his hands.
Was hoping I could refit my old PowerBook with a touch screen, but this one requires a camera, and early PowerBooks (basically any 680×0 model) didn’t come with one.
USB webcam
Even better! (I think) Because the USB camera could be placed further back from the screen to allow better definition of location as opposed to the limit of finding the “depth” of the touch (i.e. the distance the touch occurs from the mirror right next to the screen).
USB wasn’t offered on any 680×0 line.
You could find an Apple QuickTake camera, from 1994, but I don’t think it has enough resolution, and it connects via the serial port. It’s kind of large to mount on the screen; you’d likely need a separate tripod. I’m not sure it works like a webcam, offering continuous updates – the only way to get pictures off it is via the serial port. And is the CPU good enough to interpret the pictures from the camera as well as do the main work?
But it reminds me: didn’t Doug Engelbart do some work on using cameras for this sort of thing? It was around the same time he started playing with mice.
Michael
Wow, very, very clever!
1++
sooo…. 2
No, “error: expression is not assignable”.
error: expected ‘;’ before ‘}’ token
i would poke at the screen with a stick.
It would cost notebook manufacturers probably not much more than $1 to add a retractable mirror to the already built-in camera.
And the software, given the quantities, would cost next to nothing.
I wonder if someone will do it.
But talking of touch screens, I have a HANNS-G HT271HPB monitor, 27″ with touch, and while it works very well with Android, I have to say that neither Windows 10 nor Linux is really ready for touch-only operation.
They might decide not to do it just because they can’t patent it…
Bingo!
No Profit.
No Do.
You got it.
Even if it was something to save lives.
Apple would not do it.
See above. Apple would not do it because vertical touchscreens are a terrible user experience.
nsayer: “Apple would not do it because vertical touchscreens are a terrible user experience.”
They’d do a small modification, and use the webcam to detect keypresses on a totally flat, unresponsive keyboard, the better to make a macbook that’s thinner than anyone really wants.
“Virtual” keyboards are already so much of a thing that they have their own Wikipedia article.
https://en.wikipedia.org/wiki/Projection_keyboard
And they’re also a horrendous user experience. You just have to listen to all the people kvetching about the touch bar on the rMBP to understand how that could be.
nsayer: “And they’re also a horrendous user experience. ”
Exactly my point. Jony Ive’s fetish for thin and light shall not be stopped by mere concerns about usability.
Windows 10 is amazing for touch… not so much the operating system and being a touch only interface… that is still clunky… but software that allows me to mark up drawings, take notes, and whiteboard remotely makes it way more powerful than anything I can do on my android tablet ( which has a pen as well)
It would probably cost even less to include a low resolution touch interface to the screen.
Double win.
Since many people are paranoid about the camera on their laptop (you can buy stick-on shutters for them), the flap could be designed so it can cover the camera, sit at an angle, or be fully open.
Stops Apple from using the telescreen and completing your 1984 experience.
You have my vote. Kills several birds with one stone :)
You could even make the mirror function as a physical cover for the camera when not in use, to prevent any would-be spy from using the camera against you. It just needs three positions: fully down (closed), 45° down with still no view of the user (touchscreen), and fully up, opening the camera for use.
Anyone know if there’s a 3D-printable design available? Otherwise I’ll have to do it :D
But this basically means that any modern laptop would have the same capability.
So you’re lumping teachers in with butchers, snipers, and torturers? What the heck happened to *you* in school?
What? You were privately tutored?
B^)
My HP “Convertible” laptop/tablet was $25.00 used. The touch screen is a Wacom tablet and comes with a stylus.
Intel i5, 80 GB SSD, webcam, 4 GB RAM (DDR3).
Not bad, eh?
I’m just a little concerned about repeatedly touching a screen that was not designed to be repeatedly touched. Otherwise, impressive design and build.
But is there a real difference? I don’t know, but glass is glass.
There were light pens decades ago; you had to hold the tip against the screen.
HP had the HP-150 computer around 1984, using a touch screen instead of a mouse. I once dragged home an RGB monitor that may have had a touchscreen; I think it used sensors around the bezel, so the CRT was “normal” unless they specified it for constant touching.
Michael
Well, there is 1cm thick glass and 0.1mm thick glass. So glass is not always glass.
The first touchscreen I saw was mounted to an IBM PCjr. There’s a 1985 advert in PC Mag for the ‘Soft-Touch’ (it uses infrared in a bezel, hooked to a card inside the PC). “The user can program applications for the Soft-Touch in BASIC or assembly languages,” it says.
IBM also made the IBM 8516 for the PS/2 line.
Though gorilla arm was an issue during long-term use, which was also a problem with light pens.
I’m sure you could get a screen protector or “privacy film” that would help.
The torture to animals ought to be over before the butcher shows up
But that’s less fun.
This would be really cool adapted for a Magic Mirror (the one way mirror project with a screen behind so it can show you notifications)
THAT is the thing that came to my mind when I read the bit about “gorilla arm”. A touchscreen IS uncomfortable to use for long periods of time (I have one), but this would be useful and cheap to add to a magic mirror.
Cool idea! Now I hope for a generalized open source version. I mean it looks like any computer screen or, more generally, any reflective flat surface can be used for touch input with the underlying method. Would be real nice to have a package that works with the Raspberry Pi camera.
Apple would not do a touch screen because it would drive the cost up too much. They pay careful attention to the price-sensitivity of their customers.
Apple has been doing touch screens on its devices for over ten years now.
They just don’t do touch screens on their *laptops* (setting aside the touch bar – it’s a *horizontal* touch screen, after all) because, as I’ve mentioned above, it’s a stunningly bad idea.
Apple customers have a “reverse price sensitivity”: Apple stuff has to be extremely overpriced, so the customers can feel more elitist in their golden cage. I don’t buy stuff with the rotting-fruit logo.
Apple could create touchscreen Macs in a way that’d offer benefits without adding hassle, and sell quite a few more iPads to boot. All they need to do is come up with a way for iOS apps to communicate with Mac apps, allowing what’s displayed and touched on an iPad to control a Mac app. Think of Mac sound, picture, or video apps whose controls are on an iPad and adjusted by touch.
It’s being done right now as we speak, according to an article I saw yesterday about the next-gen Mac Pro.
I wouldn’t mind being able to use AirPlay to turn an iPad into an additional screen on occasion, but that’s about as far as I can see that going.
The Anish is a very thrifty and clever people… person. +1
I have a matte display.
“I don’t like it, so you can’t have it.” Meh.