
Apple: 11 Things We’d Like to See at WWDC



Barron’s was not invited to this week’s Apple (AAPL) developer conference in San Jose, WWDC 2018. We used to go, but in recent years, our invitation has been lost in the mail.

So instead we’ll watch the live-stream of the keynote with the rest of the world, while musing about those things we would love to see at the conference but know we will not.

11 things Barron’s would like to see at WWDC but will not:

1) Kinetic gestures for Apple Watch. We’re big fans of Apple’s wearable technology, and loyal users since version one. Many things could be added, but the most awesome thing we can think of is kinetic gestures, the ability to perform actions by moving one’s appendages while wearing the watch. What if you could raise your hand in a “high five” fashion when physically in someone’s presence, have the watch recognize the person standing in front of you (via a built-in camera in a future Watch edition), and send a payment to their Apple Watch?
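
For the curious, here is a minimal sketch of how such a wrist gesture might be prototyped with Core Motion on watchOS. The 1.5g threshold and the payment hook are our own illustrative assumptions, not anything Apple ships.

```swift
import CoreMotion

// Minimal sketch: detect a sharp upward wrist motion ("high five") on watchOS using
// the accelerometer. The threshold and the payment callback are illustrative
// assumptions, not a shipping Apple API for gesture payments.
final class HighFiveDetector {
    private let motion = CMMotionManager()

    func start(onHighFive: @escaping () -> Void) {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 50.0  // sample at 50 Hz
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // A quick upward swing shows up as a spike along the watch's y-axis.
            if a.y > 1.5 {
                onHighFive()  // e.g., prompt the wearer to confirm a payment
            }
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}
```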


2) Magnetic grip. Remember that scene in “Iron Man 3” where Tony Stark’s suit saves multiple airline passengers in mid-air by sending a small electrical charge through each person’s hand, creating a kind of daisy chain of clasped hands? In similar fashion, using a magnetic field emitted by the Apple Watch, one could hold onto one’s iPhone without actually gripping it, by attracting metal elements in the iPhone’s frame. Of course, unintended consequences with jewelry and silverware would have to be taken into account.

3) Gesture control across the line. It would be nice to see touchless gesture control of the iPhone, iPad, and Mac via their cameras. Just swipe in mid-air to switch between “Spaces” on a Mac, say, or to go to the home screen on an iPad, without ever touching the glass. This has been explored in peripherals for years, and, of course, it was a feature of Samsung Electronics’ (005930.KS) Galaxy smartphones some years back. But those implementations have never been satisfactory. Apple could do it better, to be sure. It would promote “lean back” computing, where you can tilt back in your desk chair and never have to hunch over the device. Plus, fewer greasy smudges.
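
One way to prototype camera-based gestures is with the Vision framework’s hand-pose request; the sketch below merely finds an index fingertip in a camera frame, and turning that tracked point into a swipe is left as an assumption.

```swift
import Vision
import CoreVideo

// Minimal sketch: spot one hand in a camera frame and read the index fingertip's
// position. Tracking that point across frames to infer a mid-air swipe is left out.
func detectHand(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up, options: [:])
    do {
        try handler.perform([request])
        guard let hand = request.results?.first else { return }
        // Fingertip position in normalized image coordinates (0...1).
        if let tip = try? hand.recognizedPoint(.indexTip), tip.confidence > 0.5 {
            print("Index fingertip at \(tip.location)")
        }
    } catch {
        print("Hand-pose request failed: \(error)")
    }
}
```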

4) Bezel-less iPad Mini. Readers of this blog know that the iPad Mini is one of the best computing devices ever invented, by Apple or anyone else. It’s not clear it will last, though, as it is increasingly lost between the iPad and the iPhone, and hasn’t been updated in a while. It would be wonderful to see a full-screen iPad mini using the same tiny bezels as the iPhone X. Snowball’s chance of seeing this, this year, or ever.

5) Force Touch 2.0. Force Touch uses “actuator” chips to give some feeling of resistance when you push with a certain amount of pressure on the screen of the iPhone or Apple Watch. It would be great to extend that illusion of surface pressure to convey textures, for example. Imagine “Force Braille,” in which every word on a web page could project the contours of its characters for someone who’s visually impaired as they press a finger across the screen. When we were kids, we loved the children’s book Pat the Bunny, which has lots of fun things for kids, like petting the fur of the bunny in question. Force Touch could be refined to produce screen textures, for the digital equivalent of Pat the Bunny.
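
There is no public API for true surface textures, but a rough approximation can be sketched with UIKit’s haptic feedback generators: tick the Taptic Engine each time the finger crosses a notional “bump.” The grid size below is purely illustrative; real Braille output would need far finer hardware control than any shipping iPhone offers.

```swift
import UIKit

// Minimal sketch: approximate a "textured" screen by ticking the Taptic Engine as a
// finger drags across notional cell boundaries. The 24-point cell size is an
// illustrative assumption.
final class TextureView: UIView {
    private let haptics = UISelectionFeedbackGenerator()
    private var lastCell = CGPoint(x: -1, y: -1)

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        // Divide the view into a coarse grid; tick once whenever the finger enters a new cell.
        let cell = CGPoint(x: floor(point.x / 24), y: floor(point.y / 24))
        if cell != lastCell {
            lastCell = cell
            haptics.selectionChanged()  // one subtle tap per "bump"
        }
    }
}
```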

6) Time Portal. Remember when Steve Jobs showed off the Time Machine feature that came to the Mac with Leopard in 2007? Not only did it look good, it was a really nice idea for how to zoom back through previous versions of things like documents you wrote and revised. We’d like to see that capability across all the devices, including the iPhone. It could be implemented via the Force Touch capability mentioned in 5): pressing hard enough on the screen would blow a hole in the user interface, opening a kind of wormhole into previous versions of something. Imagine pressing on the screen in the built-in Safari browser and zooming backward visually through your browsing history. Extending the concept, one could imagine parallel realities revealed by Force Touch, like a wrinkle or tear in the fabric of time. What would those parallel realities be? Unclear, but it sounds cool.
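
The version history such a feature would scrub through already exists for documents, via Foundation’s NSFileVersion. A minimal sketch that simply lists a file’s earlier versions, with the wormhole visuals left to the imagination:

```swift
import Foundation

// Minimal sketch: list the earlier versions of a document that a "time portal" UI
// could let you zoom back through. The presentation layer is left out entirely.
func listPreviousVersions(of fileURL: URL) {
    let versions = NSFileVersion.otherVersionsOfItem(at: fileURL) ?? []
    for version in versions {
        let date = version.modificationDate?.description ?? "unknown date"
        print("Version from \(date) stored at \(version.url)")
    }
}
```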

7) Dispense with the “modal dialogue box.” One of the worst things to come over to the iPhone and iPad from the Mac is that little rectangle that occasionally takes over the entire screen and won’t let you do anything until you address it, called a modal dialogue box. It’s a little annoying on a Mac, but it becomes positively blood-pressure-raising on a smaller screen. For example, when you’re riding the subway underground, out of range of cellular service, the iPhone may pop up a warning in the Mail application to let you know it can’t fetch any new mail. We knew that already, so having the computer interrupt everything to tell us feels like the opposite of artificial intelligence. There has to be a better way to convey such information, and the most obvious alternative would be to have the message slide in at the top of the screen via the Notifications panel, maybe with a subtle pulsation to distinguish it from non-critical notifications.
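
A minimal sketch of that alternative: post a quiet local notification through the UserNotifications framework instead of throwing a blocking alert. The identifier and wording are ours, and it assumes the user has already granted notification permission.

```swift
import UserNotifications

// Minimal sketch: surface a non-critical status ("can't fetch mail right now") as a
// banner via the notification system rather than a modal alert. Assumes notification
// permission was requested earlier; the identifier and copy are illustrative.
func postOfflineNotice() {
    let content = UNMutableNotificationContent()
    content.title = "Mail is offline"
    content.body = "New messages will download when you're back in coverage."
    content.sound = nil  // keep it quiet; this is informational, not urgent

    let request = UNNotificationRequest(identifier: "mail.offline",
                                        content: content,
                                        trigger: nil)  // a nil trigger delivers immediately
    UNUserNotificationCenter.current().add(request) { error in
        if let error = error { print("Could not post notice: \(error)") }
    }
}
```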

8) Your iPhone is the Touch Bar. We like the “Touch Bar” on the newer MacBook Pro computers, that strip of touch-sensitive controls just above the keyboard. But what if you could achieve something similar by using your iPhone as a controller? You tap or swipe on the iPhone screen and it manipulates some action on the Mac. There are vague inklings of this in the “Continuity” features that let one copy and paste instantaneously across various Apple devices, so this would be something of an extension of that.
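
One way to prototype the relay is over MultipeerConnectivity, which ships on both iOS and macOS; the service name, payload format, and the idea of driving Mac controls this way are our assumptions, and peer discovery is omitted for brevity.

```swift
import MultipeerConnectivity

// Minimal sketch: send a swipe from the iPhone to a connected Mac over
// MultipeerConnectivity. Peer discovery (MCNearbyServiceAdvertiser/Browser and the
// session delegate) is omitted; the payload format is illustrative.
final class TouchRelay {
    private let peer = MCPeerID(displayName: "iPhone-TouchBar")
    lazy var session = MCSession(peer: peer)

    // Call this from a pan-gesture handler on the iPhone; the Mac side decodes the
    // string and drives whatever on-screen control it likes.
    func send(swipeDelta: Double) {
        guard !session.connectedPeers.isEmpty,
              let payload = "swipe:\(swipeDelta)".data(using: .utf8) else { return }
        try? session.send(payload, toPeers: session.connectedPeers, with: .reliable)
    }
}
```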


9) 3-D pictures. The best use we can think of for “augmented reality” is to create pictures with the iPhone’s camera that capture the depth of the scene and let you explore it. You take a picture, and it renders the objects at their correct depth within the three-dimensional scene, rather than as flat things on a 2-D surface. You can then “push into” the picture by pressing with Force Touch. As you do so, the foreground objects grow larger, letting you move through the space of the photo. Super useful for photographing real estate, no doubt. For a true 3-D picture, one could circle around the subject to create a volumetric image that is stitched together on the spot. This could be the “killer app” in 2019 or 2020, perhaps.
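
The per-pixel depth map such a feature would lean on is already captured by dual-camera and TrueDepth iPhones through AVFoundation. A minimal sketch of grabbing it, with the push-into-the-photo rendering left out:

```swift
import AVFoundation
import CoreVideo

// Minimal sketch: capture a photo along with its depth map. Assumes photoOutput has
// already been added to a configured AVCaptureSession on a depth-capable device.
final class DepthCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func snap() {
        guard photoOutput.isDepthDataDeliverySupported else { return }
        photoOutput.isDepthDataDeliveryEnabled = true
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = true
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depth = photo.depthData else { return }
        // A depth/disparity map aligned to the photo; hand it to a renderer of your choice.
        print("Depth map is \(CVPixelBufferGetWidth(depth.depthDataMap)) pixels wide")
    }
}
```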

10) Infinity Screen. The second most-useful way to employ augmented reality is to make the iPhone or iPad a window into an infinitely large screen. As you move the device through space, the screen shifts what it’s showing to the next tile in an infinite grid of tiles, like moving a monocle over a television set. By knowing its relative position in space, the iPhone or iPad could let you examine images that are virtually hanging in the air at actual size, such as a painting. It’s conceivable this could help the disabled: gross motor movement of the arm could be more accommodating than pinching and zooming on a small screen for people who have difficulty with fine motor control. Combined with the “Force Braille” approach in 5), a visually impaired person could press down on the screen and hold their finger there as they move the iPhone through space. Another good use would be “virtual racquetball.” By pointing the rear camera at any surface, the iPhone could create a virtual ball on the screen with accurately simulated mass, position, and velocity. By swinging one’s arm through the air, the iPhone becomes the racquet, hitting the virtual ball as the phone reaches the point in space where it has computed the ball should be. The ball is then knocked forward in space, bouncing realistically off the far wall. Of course, version two could offer virtual tennis between two people, each using an iPhone. This could be a super time waster.
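
Both ideas rest on the phone knowing where it is in space, which ARKit’s world tracking already provides. A minimal sketch that pins a physics-enabled ball one meter in front of the camera; sizes and masses are illustrative, and the view is assumed to be running a world-tracking session:

```swift
import ARKit
import SceneKit

// Minimal sketch: pin a small physics-enabled ball one meter in front of wherever the
// camera is right now, fixed in world coordinates so it stays put as the phone moves.
// Assumes sceneView is already running an ARWorldTrackingConfiguration session.
func dropBall(in sceneView: ARSCNView) {
    guard let frame = sceneView.session.currentFrame else { return }

    let ball = SCNNode(geometry: SCNSphere(radius: 0.03))          // ~6 cm ball
    ball.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)  // gravity and collisions
    ball.physicsBody?.mass = 0.05

    // One meter ahead of the current camera position, expressed in world coordinates.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -1.0
    ball.simdTransform = simd_mul(frame.camera.transform, translation)

    sceneView.scene.rootNode.addChildNode(ball)
}
```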

11) Iris, not Siri. The third-best use for augmented reality would be to give the iPhone’s virtual assistant, Siri, an appearance. In the same way that the “TrueDepth” camera on the iPhone X can create animated emoji, the camera could capture any likeness and use it as the visual identity of Siri every time you summon Siri on the phone. It could be the likeness of a spouse, your child, your boss, or even yourself. Of course, speaking a few lines of text would train the iPhone on the tone and speech patterns of the individual in question, so that Siri’s voice would sound like theirs and would be perfectly aligned with the facial muscle movements. This is possibly creepy but could be a neat gimmick for the first couple of weeks of use.
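
The face tracking behind animated emoji is exposed to developers through ARKit’s blend-shape coefficients, which is roughly the plumbing a visual Siri would need. A minimal sketch that reads the jaw-open value to drive a hypothetical avatar’s mouth; the avatar itself, and any voice matching, are assumptions, not Apple APIs:

```swift
import ARKit

// Minimal sketch: read the TrueDepth camera's jaw-open blend shape each frame. Feeding
// the value into an avatar's mouth animation (and matching the voice) is left out.
final class FaceDriver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0  // 0 = closed, 1 = wide open
        print("Jaw open: \(jawOpen)")
    }
}

// Usage, on a TrueDepth-equipped device:
// let driver = FaceDriver()
// let session = ARSession()
// session.delegate = driver
// session.run(ARFaceTrackingConfiguration())
```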



