Neural Style Transfer: Down the rabbit hole and into space!

Tonight, I tumbled down the rabbit hole and ended up falling into neural style transfer. Neural style transfer uses convolutional neural networks to separate the style from the content of an image; the style and content can be represented as two largely independent features of an image. Once separated, the style representation of one 'style' image can be merged with the content of a different input image. This synthesis is shown in the graphic below. The sea turtle is the content image, and Hokusai's The Great Wave is the style image. The synthesized output is the content of the sea turtle rendered in the style of the Wave.
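The key trick in the original paper is how "style" is represented: as the correlations (Gram matrices) between the channels of a convolutional layer's feature maps, which captures texture while discarding spatial layout. Here's a minimal sketch of that computation in plain numpy, with a random array standing in for real CNN activations (the function name and shapes are my own illustration, not from the paper's code):

```python
import numpy as np

def gram_matrix(features):
    """Compute the Gram matrix of a feature map.

    features: array of shape (height, width, channels), e.g. the
    activations of one convolutional layer for an image.
    The Gram matrix measures which channels co-activate, which is
    the paper's notion of "style" -- spatial layout is thrown away.
    """
    h, w, c = features.shape
    flat = features.reshape(h * w, c)   # one row per spatial position
    return flat.T @ flat / (h * w)      # (channels, channels)

# Toy example: random "activations" standing in for a real CNN layer.
rng = np.random.default_rng(0)
acts = rng.standard_normal((4, 4, 3))
G = gram_matrix(acts)
print(G.shape)  # (3, 3): style lives in channel correlations, not pixels
```

In the full algorithm, the style loss compares Gram matrices of the style image and the output image across several VGG layers, while the content loss compares raw activations at a deeper layer; the output image is optimized to minimize both at once.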

Source: Colab notebook at https://research.google.com/seedbank/seed/neural_style_transfer_with_tfkeras

I couldn't help trying this out on one of my own photos, using the Hubble telescope's image of the Pillars of Creation in the Eagle Nebula as the style image:

Dustin in space


Other examples of this are shown below:

Germany in space

Germany in Kandinsky?

Space turtle

If you're interested in playing with this technique or learning about the background research, take a look at the resources below. The first resource lets you run the algorithm yourself without knowing anything about machine learning.

1) Google Colab has a notebook that runs this algorithm; see the Colab link on this page: https://research.google.com/seedbank/seed/neural_style_transfer_with_tfkeras

2) The 2015 paper that first described the technique is here: https://arxiv.org/abs/1508.06576. I was pretty pleased to see this paper is only 4 years old. Four years may be eons in machine learning, but it still isn't too long ago.

3) I wondered if anyone had made this into an app (because it would be fairly easy to do and I figure people would think it was fun), and of course, someone has. One example is the Pikazo App.
