Spotted: most app launches happen from other apps

Quite related to some of our own work.

---

// published on Human Computer Interaction with Mobile Devices and Services-Latest Proceeding Volume // visit site

Oh app, where art thou?: on app launching habits of smartphone users

Alina Hang, Alexander De Luca, Jonas Hartmann, Heinrich Hussmann

In this paper, we present the results of a four-week real world study on app launches on smartphones. The results show that smartphone users are confident in the way they navigate on their devices, but that there are still many opportunities for refinement. Users in our study tended to sort apps based on frequency of use, putting the most frequently used apps in places that they considered fastest to reach. Interestingly, users start most apps from within other apps, followed by the use of the homescreen.

An alternative to pinch-to-zoom

Good work, and about time. If the current UI is the "WIMP" of mobiles, hopefully we won't have to wait as long as we did with WIMP to find what's next.

---

// published on Human Computer Interaction with Mobile Devices and Services-Latest Proceeding Volume // visit site

Toward compound navigation tasks on mobiles via spatial manipulation

Michel Pahud, Ken Hinckley, Shamsi Iqbal, Abigail Sellen, Bill Buxton

We contrast the Chameleon Lens, which uses 3D movement of a mobile device held in the nonpreferred hand to support panning and zooming, with the Pinch-Flick-Drag metaphor of directly manipulating the view using multi-touch gestures. Lens-like approaches have significant potential because they can support navigation-selection, navigation-annotation, and other such compound tasks by off-loading navigation to the nonpreferred hand while the preferred hand annotates, marks a location, or draws a path on the screen. Our experimental results show that the Chameleon Lens is significantly slower than Pinch-Flick-Drag for the navigation subtask in isolation.
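The abstract doesn't spell out the exact motion-to-view mapping, but the core idea is easy to caricature. Below is a minimal, assumption-laden Python sketch (not the Chameleon Lens implementation): lateral device movement pans the view and in/out movement zooms it, with made-up gain constants, leaving the preferred hand free to mark or draw.

```python
# Toy sketch of the general spatial-manipulation idea (not the Chameleon Lens
# itself): device translation in the plane pans the view, moving the device
# toward/away from the body zooms. All gains are illustrative constants.
from dataclasses import dataclass

@dataclass
class View:
    pan_x: float = 0.0   # view offset, screen units
    pan_y: float = 0.0
    zoom: float = 1.0    # magnification factor

PAN_GAIN = 2.0    # screen units per cm of lateral device movement (assumed)
ZOOM_GAIN = 0.05  # zoom change per cm of in/out movement (assumed)

def update_view(view: View, dx_cm: float, dy_cm: float, dz_cm: float) -> View:
    """Map device displacement (cm) since the last frame onto pan/zoom."""
    view.pan_x += PAN_GAIN * dx_cm
    view.pan_y += PAN_GAIN * dy_cm
    # Pulling the device closer (negative dz) zooms in, pushing away zooms out.
    view.zoom = max(0.1, view.zoom * (1.0 - ZOOM_GAIN * dz_cm))
    return view

# Example: device moved 1 cm right and 2 cm toward the user in one frame.
v = update_view(View(), dx_cm=1.0, dy_cm=0.0, dz_cm=-2.0)
print(v)  # View(pan_x=2.0, pan_y=0.0, zoom=1.1)
```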

Find: visualizing how foods "go" together

Not sure nodes and links were the best choice here. 

---

// published on The Verge - All Posts // visit site

PB&J: One chart shows how 381 different foods will taste together

Some tastes just fit together perfectly — but why? This month, Scientific American tackles the question with an interactive chart, combining chemical analysis of 381 ingredients with data from over 50,000 recipes. The red lines indicate a shared chemical compound, like the common sugars between an apple and a glass of white wine. The analysis also unearths less expected links, like a surprising number of shared compounds between soybeans and black tea. It's a matter of taste whether those common compounds actually make the foods taste better together, but there's reason to think they do. The underlying research finds that, in European cuisine at least, chefs tend to pair flavors based on shared chemistry.
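For a sense of the structure behind the chart, here is a minimal sketch using a toy, made-up ingredient-to-compound table (not the Scientific American data): two ingredients get a link for each chemical compound they share, and the count of shared compounds serves as an edge weight.

```python
# Hypothetical sketch of a "shared compound" flavor graph.
# The ingredient -> compound mapping below is illustrative only.
from itertools import combinations

compounds = {
    "apple":      {"hexanal", "fructose", "ethyl butanoate"},
    "white wine": {"fructose", "ethyl butanoate", "linalool"},
    "soybean":    {"hexanal", "methylpyrazine"},
    "black tea":  {"linalool", "methylpyrazine"},
}

# An edge connects two ingredients for every compound they share
# (the "red lines" in the chart); edge weight = number of shared compounds.
edges = {}
for a, b in combinations(compounds, 2):
    shared = compounds[a] & compounds[b]
    if shared:
        edges[(a, b)] = len(shared)

for (a, b), weight in sorted(edges.items(), key=lambda kv: -kv[1]):
    print(f"{a} -- {b}: {weight} shared compound(s)")
```

The same pairwise weights could just as easily drive a matrix or clustered heatmap, which may be the better choice than a node-link view when most ingredient pairs share something.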


Spotted: a panel on mobile gpus at siggraph, with anandtech's anand shimpi

Shimpi and AnandTech are based in the Triangle.

---
 
// published on Computer Graphics and Interactive Techniques Conference-Latest Proceeding Volume // visit site

New directions and developments in mobile GPU design

David Blythe, Eric Demers, Barthold Lichtenbelt, James McCombe, Anand Shimpi, Dave Shreiner

Computing is evolving as smartphones and tablets increasingly become primary entertainment devices. This shift requires greater performance from mobile processors to deliver the same quality experiences that a PC or gaming console does, but without compromising battery life in a more compact mobile device form factor. This panel, composed of leading mobile graphics experts from Qualcomm, NVIDIA, Intel, ARM and Imagination Technologies, will cover the newest and best ways advanced programmers can take advantage of GPU design and optimize for mobile-specific platforms. The discussion will cover the latest graphics APIs, such as OpenGL ES 3.0 with advanced features like instancing, occlusion queries, superior texture compression formats, and multiple render targets, as well as compute APIs like OpenCL and Android's RenderScript for GPGPU acceleration.

Find: Google patents 'pay-per-gaze' eye-tracking that could measure emotional response to real-world ads

Google wants these visual experience signals: gaze and dilation. You knew those gglasses would have ads, right? 

---
  
// published on The Verge - All Posts // visit site

Google patents 'pay-per-gaze' eye-tracking that could measure emotional response to real-world ads

Advertisers spend heaps of cash on branding, bannering, and product-placing. But does anyone really look at those ads? Google could be betting that advertisers will pay to know whether consumers are actually looking at their billboards, magazine spreads, and online ads. The company was just granted a patent for "pay-per-gaze" advertising, which would employ a Google Glass-like eye sensor in order to identify when consumers are looking at advertisements in the real world and online.

From the patent application, which was filed in May 2011:

Pay per gaze advertising need not be limited to on-line advertisements, but rather can be extended to conventional advertisement media including billboards, magazines, newspapers, and other forms of...