Apps for iWatch and iOS 8 Extensions

August 22, 2014

Edge Cases had an episode on “Apps for iWatch and Apple TV”, which got me thinking about how Apple might really make use of a wearable device and how it might work.

Here’s my prediction:

The iWatch, or whatever Apple will call it, will be an accessory to an iOS device. It may have limited functionality without a paired iOS device (a watch!), but when paired with an iOS device, it will become a miniature input device. Software will be written for the iWatch in the form of iOS 8 Extensions. If one accepts the premise that the iWatch will be an accessory to the iOS device, then it logically follows that programmability will be an accessory function of an iOS app.

As an example, consider how the Today extension works: as part of an iOS app, a Today extension allows that app to present very limited content in the Today view of Notification Center. I predict that iWatch programmability will take the form of an iWatch extension that allows an iOS 8 app to present some limited information on the iWatch display. The watch won’t need to be very smart — all the CPU power will live on the iOS device — and the iWatch display needn’t be power-hungry: even a monochrome LCD would be sufficient for this functionality.
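For comparison, here is roughly what an iOS 8 Today widget looks like in code (a minimal Objective-C sketch; the label and its content are placeholders, and any analogous iWatch extension point is pure speculation on my part):

    // TodayViewController.m: a minimal iOS 8 Today widget.
    // The view controller lives in an extension bundle shipped inside the host app.
    #import <UIKit/UIKit.h>
    #import <NotificationCenter/NotificationCenter.h>

    @interface TodayViewController : UIViewController <NCWidgetProviding>
    @property (nonatomic, strong) UILabel *statusLabel;
    @end

    @implementation TodayViewController

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        self.statusLabel = [[UILabel alloc] initWithFrame:self.view.bounds];
        [self.view addSubview:self.statusLabel];
    }

    // Notification Center calls this to let the widget refresh its limited content.
    - (void)widgetPerformUpdateWithCompletionHandler:(void (^)(NCUpdateResult))completionHandler
    {
        self.statusLabel.text = @"Hello from the Today view";  // placeholder content
        completionHandler(NCUpdateResultNewData);
    }

    @end

An iWatch extension could present the same sort of limited, host-app-supplied content through an analogous entry point.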

The iWatch will also allow the same sort of user input that apps can show on the lock screen — the best example I can think of is the play/pause and skip forward/skip back controls for a podcast app or for iTunes. This will let you use the iWatch as your controller for playing audio without getting the iPhone, iPod, or iPad out of your purse or pocket. The sensors required don’t need to be very smart — I imagine the functionality being limited to swipes up/down/left/right and single, double, or triple taps, maybe with a few different tap zones on the watch for different functions. It could also be used as a way for a user to trigger text-to-speech on an incoming text or email, again so that one doesn’t need to take the iOS device out to accomplish this. I could see this being useful while driving, walking, jogging, or doing yard work or housework — exactly the occasions when I’m usually listening to a podcast.
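The hooks for this kind of remote input already exist on the lock screen side. Here is a sketch of how a podcast app registers play/pause and skip handlers with MPRemoteCommandCenter (a real iOS 7.1 API; the AudioController class and the 15-second skip are my own illustrative choices), which a watch could plausibly drive as just another remote control:

    #import <AVFoundation/AVFoundation.h>
    #import <MediaPlayer/MediaPlayer.h>

    // Hypothetical audio controller for a podcast app; the names are illustrative.
    @interface AudioController : NSObject
    @property (nonatomic, strong) AVAudioPlayer *player;
    - (void)setUpRemoteCommands;
    @end

    @implementation AudioController

    - (void)setUpRemoteCommands
    {
        MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];

        // Play and pause from the lock screen (or, speculatively, a watch tap).
        [center.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
            [self.player play];
            return MPRemoteCommandHandlerStatusSuccess;
        }];
        [center.pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
            [self.player pause];
            return MPRemoteCommandHandlerStatusSuccess;
        }];

        // Skip back 15 seconds (speculatively, a swipe on the watch).
        [center.skipBackwardCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
            self.player.currentTime -= 15.0;
            return MPRemoteCommandHandlerStatusSuccess;
        }];
    }

    @end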

The advantages I see of this approach are: A) the cost of the iWatch can be lower than if it contained its own CPU or power-hungry components, and B) Apple leverages the strength of the existing iOS App Store and app ecosystem as a way to get functionality onto the iWatch.


Popular decomposition

December 28, 2013

[Animation: SpinodalSimpsons. Regular Solution Model spinodal decomposition applied to an image from popular culture, temperature = 0.6 Tc.]
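For context, the regular solution model behind this animation has the standard free energy of mixing (textbook form; c is the composition and Ω the interaction parameter):

$$ f(c) = RT\left[c\ln c + (1-c)\ln(1-c)\right] + \Omega\,c(1-c), \qquad f''(c) = RT\left(\frac{1}{c} + \frac{1}{1-c}\right) - 2\Omega $$

At c = 1/2 the curvature turns negative for T < Ω/(2R) = Tc, so below Tc a uniform composition field is unstable and decomposes spontaneously. At T = 0.6 Tc, as here, the spinodal region is wide, which is what drives the image apart into coarsening domains.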

Apex a48 is live

March 13, 2008

The first update in a long time. This includes the planes UI I screencasted a few days ago, and plenty of other new bits in the UI and under the hood. The conversion away from DITL resources over to Interface Builder nibs is complete. I’ll be screencasting more in the next few days.

Screencast demo of the new planes feature

March 12, 2008

Here’s a screencast I did showing the new UI for manipulating and exploring planes. There’s a high-resolution version on Viddler, which works really well in full screen mode. When embedded in another page, like this YouTube link, it’s hard to see what’s on the screen.

I’ll be releasing a new version of Apex with this feature in it soon.

An Introduction to Using Apex

March 9, 2008

Here’s a step-by-step intro to some first steps with Apex. I’ll be trying to add content to this document as I am able.

UsingApex.pdf

Moved the Blog to WordPress

March 9, 2008

This blog previously lived here:

http://homepage.mac.com/olof/tomographic/

The links there should still be active. I’m trying to move the old content over here to WordPress.

A Movie Example

March 9, 2008

Here’s a movie I made to show off some of the 3D graphics of Apex. The rotation sequence is produced from a script — the other bits are screengrabs from Apex with different atoms visible in the display.

It’s all thrown together in iMovie with a voiceover I recorded late one night, so I sound a little sleepy.

Improvements in Apex a47

July 23, 2007

A number of improvements since the last update from a few months ago.

All dialogs have moved to .nib format.

The QuickTime movie code now uses the Objective-C APIs of QTKit. QTKit will be Apple’s supported QuickTime platform going forward, so it’s good to move there now.
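For the curious, writing movie frames with QTKit looks roughly like this (a sketch, not Apex’s actual code; ‘frames’ stands in for an array of rendered NSImages, and error handling is omitted):

    #import <QTKit/QTKit.h>

    // Build a QuickTime movie frame by frame with QTKit's Objective-C API.
    NSError *error = nil;
    QTMovie *movie = [[QTMovie alloc] initToWritableFile:@"/tmp/animation.mov"
                                                   error:&error];
    [movie setAttribute:[NSNumber numberWithBool:YES]
                 forKey:QTMovieEditableAttribute];
    NSDictionary *attrs = [NSDictionary dictionaryWithObject:@"jpeg"
                                                      forKey:QTAddImageCodecType];
    unsigned i;
    for (i = 0; i < [frames count]; i++) {       // 'frames': rendered animation frames
        [movie addImage:[frames objectAtIndex:i]
            forDuration:QTMakeTime(1, 30)        // one frame at 30 fps
         withAttributes:attrs];
    }
    [movie updateMovieFile];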

The OpenGL code now uses quite a bit more of the graphics card’s capability, so OpenGL images draw much more quickly. I also changed the lighting angle a little for more consistent colors in OpenGL.
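I won’t guess exactly which changes were made, but moving point drawing from immediate mode to vertex arrays is the classic way to hand more work to the graphics card; a sketch for a cloud of atom positions:

    #include <OpenGL/gl.h>

    /* Draw n atom positions in one call rather than n separate glVertex3f calls.
       'positions' points to n*3 packed floats (x, y, z per atom). */
    static void drawAtoms(const GLfloat *positions, GLsizei n)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, positions);
        glDrawArrays(GL_POINTS, 0, n);
        glDisableClientState(GL_VERTEX_ARRAY);
    }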

Next will be code for viewing only a fraction of a dataset, so those large datasets are a little more manageable.

The Reconstruction Summit Wrap

February 25, 2007

Well, I’m really late writing this, but a big thank you to Tom Kelly and everyone at Imago for putting together the reconstruction summit. Mineral Point, WI is extremely cold in February, but it was an ideal place to get a small group of scientists together for brainstorming. The Jones Mansion is quite a nice place, too. You can see the room I stayed in on their homepage — in the picture on the left, right through the bright doorway.

The scientific content of the meeting is under “Gordon Conference” style embargo, meaning that none of the participants are free to discuss the content, but I can say what I took away from it as future work for myself. One of the basic issues with the current data workflow for the LEAP is that the raw data files from the LEAP are in a less-than-fully-open format.

That’s not good for researchers, because it’s hard to tinker with data that’s trapped in a proprietary format. It’s not good for Imago, either, because the last thing Imago wants to spend time and effort on is tinkering with reconstruction algorithms. So there’s a need for some common code to interpret and make accessible the large amounts of data in a raw data file from the LEAP. I’m one of the people who volunteered to help maintain a repository of such code. We’ll need the cooperation of the folks at Imago, but we’ll see how it goes.
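To illustrate why openness matters here: the community’s simple .pos reconstruction format (not the proprietary LEAP raw format, which is the real problem) is just a flat list of atoms, each stored as four big-endian 32-bit floats for x, y, z, and mass-to-charge. Reading it takes a handful of lines (a sketch; the file path is hypothetical):

    #import <Foundation/Foundation.h>
    #include <string.h>

    // Read a .pos file: one record per atom, four big-endian floats each.
    NSData *data = [NSData dataWithContentsOfFile:@"/tmp/example.pos"];
    const uint32_t *words = (const uint32_t *)[data bytes];
    unsigned atomCount = [data length] / 16;     // 4 floats x 4 bytes per atom
    unsigned i;
    for (i = 0; i < atomCount; i++) {
        float record[4];
        int j;
        for (j = 0; j < 4; j++) {
            uint32_t hostOrder = CFSwapInt32BigToHost(words[4 * i + j]);
            memcpy(&record[j], &hostOrder, sizeof(float));
        }
        // record[0..2]: x, y, z positions (nm); record[3]: mass-to-charge (Da).
        // A real reader would store these rather than discard them.
    }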

And for the record, I tried cheese curds, but they didn’t squeak.

Test Suite Timing Comparisons

January 18, 2007

Part of the release procedure for Apex is that I run the test suite on the compiled app. ‘The test suite’ is currently nine AppleScripts which run Apex through a lot of its paces: importing and exporting files, calculating a proxigram, saving an animation as a QuickTime movie, running the select-particles-in-shell action from the isosurfaceOps plugin, and, most recently, an RDF export.

I run the suite once on a PowerPC machine and once on an Intel-based machine. If the tests succeed, it’s a pretty good indication that the build is OK to go out the door. It doesn’t catch issues in the GUI, and it doesn’t run through all the functionality, but it covers a good deal. For example, the most recent problem I discovered was that one of the plugins was building for PowerPC only because of a glitch in the Xcode config files.
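For what it’s worth, a harness to drive and time such a suite can be tiny. I don’t know how the actual suite is run (it could just as well be osascript in a shell script), but here is one sketch with hypothetical script paths:

    #import <Foundation/Foundation.h>

    // Run each AppleScript in the suite and report how long it took.
    int main(int argc, const char *argv[])
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        NSArray *scripts = [NSArray arrayWithObjects:
                            @"/tests/ImportExport.scpt",   // hypothetical paths
                            @"/tests/Proxigram.scpt",
                            nil];
        unsigned i;
        for (i = 0; i < [scripts count]; i++) {
            NSString *path = [scripts objectAtIndex:i];
            NSDictionary *errorInfo = nil;
            NSAppleScript *script = [[NSAppleScript alloc]
                initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                error:&errorInfo];
            NSDate *start = [NSDate date];
            [script executeAndReturnError:&errorInfo];
            NSLog(@"%@: %.0f seconds", path, -[start timeIntervalSinceNow]);
            [script release];
        }
        [pool drain];
        return 0;
    }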

Of course, one of the things I monitor is how long each script takes, so I can verify that I haven’t screwed anything up too much in making changes for each release. The other benefit is that having an automated test suite makes it pretty easy to move the tests to new machines. So I’ve done a little comparison of performance on a few different machines. From slowest to fastest:

G4 PowerBook, 1.25 GHz: 508 seconds
G4 PowerMac, Dual 1.25 GHz: 462 seconds
G5 PowerMac, Dual 2 GHz: 250 seconds
iMac, 1.83 GHz Intel Core 2 Duo: 209 seconds
MacBook, 2 GHz Core Duo: 192 seconds
Mac Pro, Dual 2.66 GHz Xeon: 133 seconds

These are all running 10.4.8. All in all, it’s about what you would expect, but I must say I’m very impressed by the MacBook.