
Give Your Apps a New Interface With Neural Style Transfer!

To enable both content creators and end users to seriously restyle their apps’ interfaces while maintaining content detail clarity essential to their usability, researchers from Stanford have proposed ImagineNet, a novel and powerful new tool for interface customisation.

Yelp for food, Feedly for news, WeChat for communication, and TikTok for fun: there are over two million apps to choose from nowadays, and they have become an inseparable part of our lives. We spend countless hours on our favourite apps, so why not customise that experience?

While most mobile apps have one-size-fits-all graphical user interfaces (GUIs), studies have shown that many users actually enjoy a bit of variety in their GUIs. Candy Crush, for example, adapts to different styles at different stages of the game, and Line messenger allows users to change the start screen, friends list, and chat screens. In most cases, however, this flexibility is limited, and user customisations may be lost when developers update the app.


ImagineNet uses a neural style transfer model that enables users to apply the style of, for example, an artwork to change the visual appearance of a mobile app and its assets: picture an Abstract Expressionist home screen, a Fauvist pizza delivery app, a Cubist Pac-Man video game.

“We imagine a future where users will expect to see beautiful designs in every app they use, and to enjoy variety in design like they would with fashion today,” write the researchers in their paper ImagineNet: Restyling Apps Using Neural Style Transfer.

Style transfer is a computer vision task that extracts the style from a reference image and applies it to an input image. The Stanford paper proposes a neural solution that adds to the original style transfer model a new structure component, computed as the uncentered cross-covariance between features across different layers of a CNN. The structure component links up the style elements chosen for each position across levels, better transferring style to interface assets such as buttons and frames.
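The core of the structure component can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the paper's implementation: it computes an uncentered cross-covariance between two CNN feature maps, assuming they have already been resized to a common spatial resolution (in a real pipeline the features would come from, say, VGG layers, with one layer upsampled to match the other).

```python
import numpy as np

def uncentered_cross_covariance(feat_a, feat_b):
    """Uncentered cross-covariance between two CNN feature maps.

    feat_a: (C_a, H, W); feat_b: (C_b, H, W). Both are assumed to share
    the same spatial resolution. The result is a (C_a, C_b) matrix,
    analogous to a Gram matrix but computed across two different layers
    rather than within a single one, which is what lets the structure
    term tie style choices together across levels of the network.
    """
    c_a, h, w = feat_a.shape
    c_b = feat_b.shape[0]
    flat_a = feat_a.reshape(c_a, h * w)   # (C_a, HW)
    flat_b = feat_b.reshape(c_b, h * w)   # (C_b, HW)
    # No mean subtraction: the covariance is deliberately uncentered.
    return flat_a @ flat_b.T / (h * w)
```

Setting both inputs to the same layer's features recovers the familiar (normalised) Gram matrix used by the standard style loss.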

The researchers found no previous attempts to apply style transfer to GUIs, so they compared ImagineNet against style transfer algorithms designed for photorealistic images. However, when they tested those algorithms on the GUI of the MemoryGame Android app, they discovered that styling the game with these techniques rendered it unplayable.


By minimizing the squared error in the structure between the style and output images, ImagineNet is able to retain the details of GUIs while transferring the colors and textures of the chosen art style with consistency.

ImagineNet is designed to restyle virtually any type of app, enabling users to give full play to their creativity and personalise the games they love best.

The paper ImagineNet: Restyling Apps Using Neural Style Transfer is on arXiv.


Author: Yuan Yuan | Editor: Michael Sarazen
