The single most irritating killed feature from Apple. Redesign half of their UI to rely on 3D Touch to make sense, then get rid of 3D Touch without redesigning the UI. Previewing links, moving the cursor, interacting with items, they’re all “press and hold until haptic feedback” instead of “quickly press hard and get immediate feedback.” Easier to accidentally trigger, slower to trigger on purpose.
Hardware cost + extra weight (you need to make the glass thicker so it can handle the extra force without pressing on the display). Turns out nobody was really using it because discoverability sucked.
Hardware cost & weight, fine. The glass doesn't need to be thicker than it currently is (I can press on my 13 Pro's screen about twice as hard as was needed for 3D Touch's max depth, with no issues with the screen), and the last time I replaced a battery on a 12, the screen was just as thick as the XS's.
>Turns out nobody was really using it because discoverability sucked.
Sure, but then redesign the UI after removing 3D Touch so it's not equally undiscoverable but less precise. Even on the latest iOS beta, with its full redesign, there are still many, many actions that require a long press and are completely undiscoverable. (For example, if you don't have the Shazam app installed, go find the list of songs Siri has recognized when asked "What's this song?" Don't look up the answer.)
> Glass doesn't need to be thicker than it currently is (I can press on my 13 Pro's screen about twice as hard as was needed for 3D Touch's max depth, and no issues with the screen)
I don't think this is a great argument. Maybe the glass needs to be thicker so the sensors at the border can properly measure the pressure, not because the screen is close to shattering.
He is capable of pressing twice as hard as the feature required at maximum. The screen handles 2x the maximum without issues. Therefore, the glass is thick enough to handle half that pressure, as required by the feature.
It's a good argument.
As far as I know, the pressure is measured around the edge of the screen. If the screen is thin enough, it could bend when pressed, and the pressure applied to the center of the screen can't be properly measured. I don't think the problem with a too-thin screen is the screen breaking when you press it.
3D Touch was amazing for typing alone, I miss it basically every day when I type more than a couple of words on my phone. It was so great to be able to firm-press and slide to move the insertion point, or firmer press to select a word or create a selection. It was like a stripped down mobile version of the kind of write-and-edit flow of jumping around between words that I can get on a proper keyboard with Emacs keybindings drilled into my brain.
You can still move the cursor by long pressing on the space bar, in case you didn't know. There's no equivalent replacement for the selection behavior you're describing, though (as far as I'm aware).
The discoverability sucked because Apple never rolled this out to all of their devices, themselves grossly underutilized the feature, and eventually ghosted it.
It was by far the best cursor control paradigm on iOS. Now everything is a long press, which is slow and just as error-prone.
I'm all for proposing different paradigms for accessibility, but 3D Touch was awesome.
I don't like it when old people are the reason the rest of us can't have nice things. Some grandma in Nebraska can't use 3D Touch, and now the rest of Apple's customer base is deprived of it.
When I had an iPhone XS, I could never understand how to predictably do a normal touch or a 3D touch, or where exactly the OS had different actions for one vs. the other.
And I play games [1] using just my MacBook Pro's trackpad...
[1] For example, Minecraft works perfectly without a mouse. So does Path of Exile. First-person shooters, of course, don't.
There was a principle of UI design that all UI actions should be discoverable, either with a visible button or a menu item in the menus at the top of the screen (or window on Windows). This is annoying for power users and frequently used actions, so those can also be made available with keyboard shortcuts or right-click actions or what have you, but they must always be optional. This allows power users to be power users without impacting usability for novices.
We've been losing this idea recently, especially in mobile UIs where there's a lot of functionality, not much space to put it in, and no equivalent of the menu bar.
You can use any phone with a barometer to make a scale. All iPhones since the 6, all the Pixels, and Samsung flagships have one. You get a Ziploc bag, blow some air into it, and put your phone in, running an app that shows the pressure in a big font (so you can see it through the Ziploc). Then you put an object of known weight on top, like a quarter (balanced carefully on the air-filled Ziploc), and note the pressure change on the display. I think the weight / pressure change scales linearly, so you can now weigh anything small that you can balance on the Ziploc.
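For the readout side, here's a minimal Swift sketch, assuming an iPhone with a barometer. CMAltimeter reports absolute pressure in kilopascals; the gramsPerKPa constant is hypothetical, something you'd derive yourself from the quarter step above (a US quarter weighs 5.67 g):

    import CoreMotion

    let altimeter = CMAltimeter()
    var baselineKPa: Double?       // pressure with the empty, sealed bag
    let gramsPerKPa = 2500.0       // HYPOTHETICAL: from your own quarter calibration

    if CMAltimeter.isRelativeAltitudeAvailable() {
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, _ in
            guard let kPa = data?.pressure.doubleValue else { return }
            let base = baselineKPa ?? kPa   // first reading becomes the tare
            baselineKPa = base
            let grams = (kPa - base) * gramsPerKPa
            print(String(format: "%+.4f kPa ~ %.1f g", kPa - base, grams))
        }
    }

(Update rate and the Motion permission prompt vary by device and iOS version, so treat this as a sketch, not a shipping scale.)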
The infamous Dropbox comment[0] actually didn't even cite rsync; it recommended getting a remote FTP account, using curlftpfs to mount it locally, and then using SVN or CVS to get versioning support.
The double irony of that comment is that pretty much all of the technologies listed are obsolete now, while Dropbox is still going strong: FTP has mostly been replaced by SFTP and rsync due to its lack of encryption and hard-to-manage network architecture; direct mounting of remote hosts still happens, but in my experience it's more typical to keep local copies of everything that are then synced to the remote host for redundancy; and CVS and SVN have been almost completely replaced by Git outside of some specialist and legacy use cases.
The "evaluating new products" xkcd[1] is extremely relevant, as is the continued ultra-success of Apple: developing new technologies, and then turning around and marketing those technologies to people who aren't already in this field working on them are effectively two completely different business models.
[0]: https://news.ycombinator.com/item?id=9224
[1]: https://xkcd.com/1497/
No affiliation whatsoever, but the app PHYPHOX has access to basically all of your iPhone's sensors and can show the information in real time and save it. It even has the capability of running a local Python server, so you can access it from a web browser on the same network or a tethered device.
I think this is neat, but only in a Rube Goldberg machine sort of way. The instructions are:
1. Open the scale
2. Rest your finger on the trackpad
3. While maintaining finger contact, put your object on the trackpad
4. Put as little pressure on the trackpad as you can while still maintaining contact. The reading you see now is the weight of your object
That is, the pressure sensing only works while the pad detects capacitance, so you need to be touching the trackpad (but not too much!!) while weighing something.
Could a small piece of conductive foam or some cleverly layered tin foil + paper work? Then you'd put the object on the shim (which has a known or even negligible weight).
I once put some aluminum duct tape completely over the touch pad of an old laptop to see what would happen. Turns out it induced enough "eddy currents" to make the mouse move around the screen without me touching it--in a way, visualizing the currents!
I connected the foil to ground using a small strip of the tape to the ground metal of a USB port on the side and it disabled the touch pad.
I remember drawing on my old iPad back in the day by shoving a wet q-tip into a BIC pen and using it as a stylus. I am sure something similar could be rigged here
Nice. No, I preemptively armed myself with a carrot before taking the dog for a walk in cold weather.
It's the reason why I love Note and S Ultra phones - the stylus. I'm using it now.
I only had a non-stylus smartphone for a year and a half before whimpering back to the Note series. It's what keeps me in the Samsung sphere of influence.
Ever try putting gloves back on when your hands and the gloves are both wet? This is why I print recipes on the laser, and just take the paper version downstairs.
> TrackWeight utilizes the Open Multi-Touch Support library by Takuto Nakamura to gain private access to all mouse and trackpad events on macOS. This library provides detailed touch data including pressure readings that are normally inaccessible to standard applications.
How can something be available as a library but not as a native interface? Does Swift not expose that API?
macOS has "Private Frameworks": shared libraries that are used by the system but don't ship with headers by default. It's trivial to produce these headers from the libraries and then make wrappers for them, like OpenMultitouchSupport, which is a wrapper for MultitouchSupport.framework.
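To make that concrete, here's a rough Swift sketch of the wrapper pattern: dlopen the private binary and dlsym the reverse-engineered entry points. MTDeviceCreateDefault is one of the commonly documented private symbols, but Apple ships no headers for this framework, so treat the name and signature as assumptions that can break in any macOS release:

    import Foundation

    // Assumed C signature for a private, reverse-engineered symbol.
    typealias MTDeviceCreateDefault = @convention(c) () -> UnsafeMutableRawPointer?

    let path = "/System/Library/PrivateFrameworks/MultitouchSupport.framework/MultitouchSupport"
    guard let handle = dlopen(path, RTLD_NOW) else {
        fatalError("couldn't load MultitouchSupport")
    }
    guard let sym = dlsym(handle, "MTDeviceCreateDefault") else {
        fatalError("symbol not found")
    }
    // Cast the raw symbol to a C function pointer and call it.
    let createDefault = unsafeBitCast(sym, to: MTDeviceCreateDefault.self)
    let device = createDefault()
    print(device == nil ? "no multitouch device" : "got a multitouch device")

A library like OpenMultitouchSupport is essentially this plus dumped headers and a nicer Swift API on top.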
I wrote that software, called SeisMac. Someone figured out the Apple-private API for the Sudden Motion Sensor that parks your laptop's hard drive if it detects free-fall. Working from that, I wrote a free app that used the API to show three-axis acceleration graphs. I was proudest of the calibration utility, which had you tip your laptop on its side (with properly rotated dialogs!), and then on its screen.
People would send me recordings from all over the world (e.g. on a ship in the Drake Passage showing enormous surges). It was a lot of fun, and I even got an educational grant to improve it.
Big bummer when Apple switched to solid-state drives (well, a bummer for my one small reason...)
[0]: https://en.wikipedia.org/wiki/Sudden_Motion_Sensor
I used an iPhone as an air pressure recorder. There's an app for that; many actually. Anyways, the trunk gate on my car wasn't sealing and when it went over pavement joints on the highway it would slightly open and then close in quick succession which was nauseating. I showed the data to Tesla service and they (grumbled and) readjusted the trunk gate. The problem disappeared.
I heard that IBM decided to move out of this building [1] because vibration due to the construction of the tower across the street kept destroying hard drives in their computing center.
[1]: https://en.wikipedia.org/wiki/330_North_Wabash
Reminds me of the people who used their ThinkPad's vibration sensor to detect smacks on the machine, and rigged their X window manager to switch virtual desktops when smacked from the appropriate side, panning right when smacked on the left, and left when smacked on the right.
Have you done any testing to determine how precise and accurate this is? I suspect there must be a lot of variance between laptops, since this isn't an intended use case.
I would assume Apple hardware comes precalibrated. Homogeneity is everything for their product lines, down to individual calibration of screens and audio hardware. It would be weird to get a new laptop and have its trackpad feel different.
> I suspect there must be a lot of variance between laptops, since this isn't an intended use case.
Yeah, and the same goes for ordinary strain gauges, aka load cells [1]. You can either use a 2-point calibration (no load followed by a known load) or, if you want more precision, a 3-point calibration.
[1] https://en.wikipedia.org/wiki/Load_cell
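For the 2-point case the math is just a linear map from raw sensor reading to grams; a sketch in Swift, with made-up trackpad numbers standing in for real measurements:

    struct TwoPointScale {
        let raw0: Double      // reading at no load (tare point)
        let raw1: Double      // reading under the known reference mass
        let refGrams: Double  // the reference mass, in grams

        func grams(fromRaw raw: Double) -> Double {
            refGrams * (raw - raw0) / (raw1 - raw0)
        }
    }

    // Hypothetical: trackpad reads 0.02 empty and 1.37 under a 100 g weight.
    let scale = TwoPointScale(raw0: 0.02, raw1: 1.37, refGrams: 100)
    print(scale.grams(fromRaw: 0.70))   // ~50.4 g

A 3-point calibration instead fits a curve (say, a quadratic) through three known loads, which absorbs mild sensor nonlinearity.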
Somebody could use this as a starting point. http://touchscale.co/ You'd have to collect new data on touch strength vs. weight to get the regression parameters.
(If you do this, let me know and I can add it to the site above, and then we can both delight in the surprisingly large amount of unmonetizable traffic it gets.)
By "downloaded," I expect that you mean "Built, tested, and deployed." It's not an App Store app. It's basically a technology demo. Get Xcode, and build it and run it.
You could always request that from the author. Since it's a Mac app, they could do that. Not so if it were an iOS app.
It's a pretty basic SwiftUI app. They haven't really polished it, so I could see why they might not be interested in making it much more accessible. It's a tool for Mac geeks.
Speaking for myself, I have a whole bunch of packages, and almost every one has a test harness. Many of the test harnesses are "full-fat" iOS apps, so they can't be provided as releases, unless I create an App Store app for each one.
They need to be built and run. A couple are Mac apps, but the whole deal with them is that they are test harnesses, so divorcing them from the IDE sort of negates their purpose. They are meant to help other Apple developers understand and use the packages the apps are associated with.
https://www.theverge.com/2015/10/28/9625340/iphone-6s-gravit...
I have no idea why you'd go out of your way to do that, other than placating image-sharing services.
https://allthegooddomainsweretaken.justinmiller.io/2007/04/0...
I wonder if that affects this app at all.
* Not legal for trade outside of Ankh-Morpork.
Settings -> Desktop & Dock -> Hot corners -> set one to "Disable Screen Saver"
Then just shove your cursor into that corner whenever you want to leave your computer without it sleeping
Type `caffeinate` into the terminal; your Mac will stay awake until you Ctrl-C it.
I remember trying a bunch of flags until I found some combination that worked, and looking around, it sounds like it's still a common issue:
https://apple.stackexchange.com/questions/475477/caffeinate-...
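One likely gotcha, going by the caffeinate man page: with no flags it only takes out an idle-system-sleep assertion, so the display can still sleep and the screen can still lock. The display flag is usually what people actually want:

    caffeinate -d         # keep the display awake too
    caffeinate -di        # display + idle-sleep assertions
    caffeinate -t 3600    # hold the assertion for an hour, then exit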