Categories
Augmentation Futures Wearables

Screens. Lots of screens.

I’ve wanted to be able to do this for a long time, and now we can. Here’s a video of my screen a minute or so ago​*​.

If you have an Oculus Quest and install the Immersed app, you can put your Mac’s displays into a VR world, then add more virtual monitors and place them in space around you. This is slightly bonkers. But not as bonkers as the fact that you can still use Zoom.

But then again, Immersed also lets you put yourself into a shared VR co-working space. So, you know, move that one from the future column over to the present day.


  1. ​*​
    Extra points to the people/cult members who spot the other thing hanging out in this video, and know of its brilliance. Because *damn* it’s good.
Categories
Augmentation Futures Wearables Writing

ReadyBrek, AR, and Flirting in the age of Corona

If you grew up in the UK in the 1980s, you’ll know this image. It’s from a TV advert for a breakfast cereal called Ready Brek. It’s porridge, but much more finely milled, so it’s ready to eat as soon as you add the milk. It is irredeemably grim, as are their adverts. I mean, I’m not saying that this part of my childhood is why I’ve spent most of my adulthood in Mediterranean climates, but watch this:

If this makes you nostalgic, you’re insane.

Anyway, the red outline to the child, denoting a belly full of carbohydrate mush to fuel their way through the Thatcherite dystopia, is actually a pretty good bit of user interface when applied to the present day unpleasantness.

We know that Apple have AR glasses coming to market in the next year or so, from this sort of reporting and this and this. And despite the social nightmare of the forward-facing camera that contributed to the Google Glass fiasco, AR glasses rely on forward-facing sensors of some form. It’d be easy – trivial, in fact – to overlay a social distancing guide on your field of view. And maybe a bit trickier, but still thereabouts, for a forward-facing IR camera to overlay anyone running an obvious fever with a ReadyBrek outline.
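At its core, that IR-camera idea is just thresholding a thermal image. A toy sketch in Python – the threshold, array shapes, and function names here are all invented for illustration; a real system would need calibration, face detection, and a lot more besides:

```python
# Hypothetical sketch: given a thermal frame (per-pixel temperatures in °C),
# mark the regions to be given a "Ready Brek"-style glow in the AR overlay.
import numpy as np

FEVER_C = 38.0  # rough skin-temperature cutoff; invented, not calibrated

def fever_mask(thermal_frame: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels hot enough to outline in the overlay."""
    return thermal_frame >= FEVER_C

# A 2x2 "frame": top row is normal temperature, bottom row is feverish.
frame = np.array([[36.5, 37.0],
                  [38.5, 39.1]])
mask = fever_mask(frame)
# mask picks out the two hot pixels on the bottom row
```

Everything interesting – distinguishing a fever from a hot coffee, attaching the glow to a person rather than a pixel – lives outside this sketch, of course.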

Apple, of course, have a patent for technology that would do this very thing​*​. https://patents.google.com/patent/WO2019067650A1/

The social practices that might come from exposing feverishness without contact, or from actually giving a genuine measure of the six-foot social distancing radius, are interesting to consider. After it becomes a matter of etiquette and good social graces to keep a distance, the violation of that boundary can be either an act of extreme aggression, or one of subtle intimacy: the On peut se tutoyer of public space. Or perhaps something more flirtatious. After all, fans (which, given the unpleasantness of wearing a facemask in the summer, are due a comeback) have a long tradition of signaling and seductive language. Tiny breaches of the six-foot boundary will take on their own greater meanings, while the unveiling of the lower half of the face becomes the most private of moments.


  1. ​*​
    One of the dark secrets of the Futurist trade is that people generally announce their plans well in advance. Whether in patent applications or political manifestos, most people are quite open about what they’re going to do – even people who you might think would want to keep their Evil Plan secret. At a rough guess, 99.5% of all punditry is just special pleading around this: “Surely they won’t do what they’ve plainly said they will?” Yes, yes they will.
Categories
Augmentation Complex Systems Wearables

Right Effort, Right Interface

So here’s irony. If I stare straight ahead right now, all I can see are screens. My new desk, with my new monitor setup, takes up almost all of my field of view, and yet, as luxurious as this is, it’s led me to hours at a time away from screens altogether.

Here’s the thing: while for some types of work, two (or, swoon, three) big monitors and some associated ergonomically enhancing desk setup would seem to be the thing (see Multiple Screens and Devices in an Information-rich Environment, for example), I’m profoundly loath to sit at a desk all day. I live in Southern California, and there’s sunshine I don’t want to miss out on.

Moreover, the years of following a GTD practice have led me to realize the power of breaking my work up into chunks of, so to say, right thing, right place, right time. Being able to escape one form of computing, and with it, the form of thinking that is implied and enforced by those tools, is a powerful technique.

At the same time, as a student of Zen Buddhism, I’m trying to pay attention, in a very active sense, to what I do as a daily practice in all parts of my life. Right now, it’s the Spring Practice Period at the San Francisco Zen Center, (I’m attending online from down here in LA), and the theme this season is Wise Effort. The associated meditations have me thinking on this anew, and from there, in part, on how much time I spend on my phone.

Every morning, for example, I take my daughter to school, and we stop off for coffee/steamed milk on the way. Too often have I caught myself engaging with Twitter rather than with her during those brief minutes, and that doesn’t require exquisite Zen training to uncover as not-ideal for either of us.

So for the past few weeks, I’ve been wrapping my phone up in a cloth, furoshiki-style, and putting it into my bag. The added friction of getting it out, unwrapping it, turning it back on, et-connecting-cetera, takes away the quick hit of checking notifications, and the inevitable in-suck from there. It works well.

So far, so digital-detox middle-aged-angst. But it has lead to something new.


I need to work on my 風呂敷 technique.

Leaving my phone wrapped up is anxiety-inducing, because while I know my own work schedule well enough, there’s always an ego-driven part of the brain that thinks I’m an air-traffic controller with a part-time heart surgery practice. Those neurons want me to check my inbox on a regular basis, just to make sure I’m not being called into action for something that has to happen right now now now. By, you know, something on Twitter. ¯\_(ツ)_/¯

This is bollocks, obviously, but it’s nonetheless an itch that needs to be scratched. Combine it with my podcast-listening habit, and you have two major drives pushing me to unwrap the thing, to pull it out every few minutes.

But I don’t. I’ve defaulted to my watch as my primary mobile platform. Podcasts I can get, after installing Outcast for Apple Watch, and the rest of my super urgent messages and pushes can be relied on to come through just fine. (Which is to say, never, as I’m not a coastguard).

Thus becalmed, my brain can get on with other stuff. And here’s the new thing. As I mentioned on Twitter yesterday, as part of a thread started by the splendid @hondanhon: by defaulting to a new, tiny platform, I’m forced to use its features more deeply. Turns out, they’re really good:

Many of my work tasks, it seems, are much better served by combining the pens and paper of my choice with Siri-based interaction with my watch via AirPods. While my desktop machine has a whole monitor dedicated to my inboxes, Omnifocus, and calendar, and my weekly review is for sure a multiscreen activity, if I’m planning a talk, for example, I just need to send quick messages and drop reminders to myself. I can do that, and even some pretty nice long-form dictation into Evernote, from Siri. I’m both connected to my IT-ecosystem, and pleasingly untethered from it.

Is it the perfect platform for the entire day? No. But it does speak to an interesting trend. As my phone gets ever more powerful, there’s sometimes a feeling that it’s too powerful to be brought out without proper psychological preparation. It’s too moreish a device to just have in my pocket, and most of the time I have no operational need for it to be quick-draw holstered there either.

And also, voice interfaces are pretty good – the household menagerie of smart speakers speak to that already – and offer a new way of interacting that doesn’t require eyes or fingers. In the kitchen, or the bedroom, a quiet word to an Alexa is almost always cognitively appropriate to the computational task at hand.

That’s what I suppose I’m searching for here: matching an appropriateness of interface to the task, rather than matching my tasks to the interface I have in front of me; deciding what to do based on priorities other than what technology I am sat with. This could involve breaking habits I didn’t know I had. Or, at least, having to become aware of how much my available toolset shapes my thinking. Let’s see. Onwards.

Categories
Augmentation Futures Wearables

Possible Problems of Persona Politeness

One of my AIs is funnier than the other. This is proving to be a problem.

But first, consider how the amazing becomes normal very quickly. It feels like I’ve been using Siri on my phone my entire life, Siri on the iPad charging by my bed since forever, and Siri on my watch since last summer. I’ve not, of course. She’s only four years old this October. But nevertheless, as with any new life-spanning tech, she’s become background-banal, in a good way, remarkably quickly. Voice interfaces are, without a doubt, A Thing.

And so it is with Alexa, the persona of the Amazon Echo, living in my kitchen for the past fortnight. She became a completely integrated part of family life almost immediately. Walking into the kitchen in the morning, ten-month-old daughter in one hand, making my wife tea with the other, I can turn on the lights, listen to the latest news from the radio, check my diary, and order more milk, just by speaking aloud, then turn it all off again as I leave. It’s a technological sprezzatura sequence that never fails to make me smile. Thanks, Alexa, I say. Good morning.

But there’s the rub. Alexa doesn’t acknowledge my thanks. There’s no banter, no trill of mutual appreciation, no silly little, “it is you who must be thanked” line. She just sits there sullenly, silently, ignoring my pleasantries. 

And this is starting to feel weird, and makes me wonder if there’s an uncanny valley for politeness. Not one based on listening comprehension, or natural language parsing, but one based on the little rituals of social interaction. If I ask a person, say, what the weather is going to be, and they answer, I thank them, and they reply back to that thanks, and we part happy. If I ask Alexa what the weather is, and thank her, she ignores my thanks. I feel, insanely but even so, snubbed. Or worse, that I’ve snubbed her.

It’s a little wrinkle in what is really a miraculous device, but it’s a serious thing: The Amazon Echo differs from Siri in that it’s a communally available service. Interactions with Alexa are available to, and obvious to, everyone in the house, and my inability to be polite with her has a knock-on effect. My daughter is too young to speak yet, but she does see and hear all of our interactions with Alexa. I worry what sort of precedent we are setting for her, in terms of her own future interactions with bots and AIs as well as with people, if she hears me being forced into impolite conversations because of the limitations of her household AI’s interface. It’s the computing equivalent of being rude to waitresses. We shouldn’t allow it, and certainly not by lack of design. Worries about toddler screen time are nothing, compared to future worries about inadvertently teaching your child to be rude to robots.

It’s not an outlandish thought. I, myself, am already starting to distinguish between the personalities of the different bots in my life. Phone Siri is funnier than Watch Siri; Slackbot is cheeky, and might need reeducating; iPad Siri seems shy and isolated. From these personalities, from these interactions, we’ll take our cues. Not only in how to interact, but when and where, and what we can do together. If Watch Siri was funnier, I’d talk to her more. If Phone Siri was more pre-emptive, our relationship might change. And it’s in the little, non-critical interactions that their character comes through.

All this is, of course, easier said than done by someone who isn’t a member of the Amazon design team – hey there lab126, beers on me if you’re in LA soon – but there’s definitely interesting scope to grow for the seemingly extraneous stuff that actually makes all the difference. Personality Design for AIs. That’s a fun playground. Is anyone playing there?

Categories
Augmentation Futures Wearables

The Internet of Tells: Constant biomonitoring and some uses

In poker, they call them tells. The little physical signs that we can’t control that give away our inner mental state. What happens if we make these privately machine-readable?

For me, a lot of the fun of future technologies isn’t new tech per se, but the coming together of three or four older things, refined by new physical capabilities and design understandings, to push over the Hill of Single Use into a new valley of possible products. A strained metaphor, perhaps, so let me give you an example. Heart rate monitors have been around for years. I’ve been running with one strapped to my chest for at least a decade myself, and in those days the data was restricted to that one single device (and later to a single app, barring the export of averages and such very high-level takes). You certainly didn’t wear an HR monitor all the time, and even if you did, you couldn’t use what it saw for anything other than athletic training.

But 2015 will see at least two products come to mass-market that might do such a thing: the Jawbone UP3, and the Apple Watch.

The back of the Apple Watch, showing the HR monitor

The Apple Watch has an HR monitor on its back, has local processing, and has a data connection (and through that, infinite cloud processing) – but more than that, it has access to everything else we might do digitally. Not just publishing capability (send my HR to Facebook, tweet when I go over 180, and so on) but a form of sense-making too. The complex network around the Apple Watch knows an awful lot about your personal context – that’s really its point, after all – and so it could start to make all sorts of correlations between HR and that context.

We know that changes in HR can reflect changes in psychological state. Your heart beats faster when you’re aroused or stressed or angry. And we now have a device that can notice that tell, and try to work out what is causing it. What might that do? Here are some scenarios, and possible products:

  1. One to One. You regularly meet with someone, Mr X, who drives you insane. A deeply stressful person, who causes your heart to beat hard as you restrain yourself from violence. An asshole of the highest order. Your system detects the increase in heart rate, and sees it happens whenever you have a calendar appointment with Mr X. Matching the appointment data with LinkedIn, it identifies Mr X, and posts the “Meeting with Mr X is stressful” posit to a LinkedIn API-using offshoot, a “Rate My Meeting” clone. Over time, Mr X’s rating is further added to by others’ systems, perhaps without user input at all, flagging Mr X as (algorithmically designated) asshole. The system acts accordingly.
  2. Many to One. You walk to work down Oxford Street, but prefer to slip through side streets if the foot traffic is annoyingly dense. Luckily, the HR monitors on the wrists of tens of Apple Watch wearers already on Oxford Street are spiking higher than they usually average here, at this time of day, with this sort of weather. Your system notices this, and gently nudges you away from the area, pre-emptively avoiding the stress that others are giving away to the network.
  3. Many to Many. You’re at a concert, and having a splendid time. Your HR is rising as the music builds, and from your watch you can see that others in the crowd are feeling it too. The crowd average HR goes past 140…141…144…147…149… and as soon as it reaches 150, it triggers the drop, the stage pyros, the lasers, the dancing girls. The musicians onstage, able to reach their musical climax just as the audience reaches theirs. That’s showbusiness.
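The scenarios above need little more than a rolling average and a threshold. A toy sketch of the concert case in Python – the 150 bpm trigger is from the scenario, but every name here is invented for illustration:

```python
# Hypothetical sketch of "Many to Many": fire the show cues once the
# crowd's average heart rate crosses a threshold.
from statistics import mean

TRIGGER_BPM = 150  # the "drop" threshold from the scenario

def crowd_average(readings):
    """Average the latest HR reading (bpm) across all wearers."""
    return mean(readings.values())

def should_trigger(readings, threshold=TRIGGER_BPM):
    """True once the crowd average reaches the threshold."""
    return crowd_average(readings) >= threshold

# Latest readings keyed by (anonymised) wearer id.
readings = {"w1": 147, "w2": 152, "w3": 151}
if should_trigger(readings):
    print("cue: the drop, the pyros, the lasers, the dancing girls")
```

The hard part isn’t the arithmetic, it’s the installed base: none of this works until enough wrists are reporting in.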

None of these use-cases, and there are many more, require a new magical technology. Apart from the actual heart-monitoring, you could prototype them today all quite (handwaving here) easily. But none of them would work without a good installed base of constantly available HR monitors already in place. That, if Apple and Jawbone and the rest get their way, is what we’re about to have. It’s a whole new product/service category, being unlocked almost by mistake.