Finally, AI-Based Painting is Here!

Dear Fellow Scholars, this is Two Minute Papers
with Károly Zsolnai-Fehér. A few years ago, the Generative Adversarial
Network architecture appeared, in which two neural networks try to outcompete
each other. It has been used extensively for image generation,
and has become a research subfield of its own. For instance, they can generate faces of people
that don’t exist, and much, much more. This is great; we should be grateful to live
in a time when breakthroughs like this happen in AI research. However, we should also note that artists
usually have a vision of the work that they would like to create, and instead of just
getting a deluge of new images, most of them would prefer to have some sort of artistic
control over the results. This work offers something that they call
semantic paint brushes. This means that we can paint not in terms
of colors, but in terms of concepts. Now, this may sound a little nebulous, so if
you look here, you will see that, as a result, we can grow trees, change buildings, and do all
kinds of shenanigans without having to draw the results by hand. Look at those marvelous results! It works by compressing these images down
into a latent space. This is a sparse representation
that captures the essence of these images. One of the key ideas is that this can then
be reconstructed by a generator neural network to get a similar image back. However, the
twist is that while we are in the latent domain, we can apply these intuitive edits to the
image, so when the generation step takes place, it will carry our changes through. If you look at the paper, you will see that
just using one generator network doesn’t yield these great results; the
generator needs to be specific to the image we are currently editing. The included user study shows that the new
method is preferred over the previous techniques. Now, like all of these methods, this is not
without limitations. Here you see that despite trying to remove
the chairs from the scene, amusingly, we get them right back. That’s a bunch of chairs free of charge;
in fact, I am not even sure how many chairs we got here. If you figured that out, make sure to leave
a comment about it, but all in all, that’s not what we asked for, and solving this remains
a challenge for the entire family of these algorithms. And, good news, in fact, when talking about
a paper, probably the best kind of news: you can try it online through a web demo
right now. Make sure to try it out and post your results
here if you find anything interesting. The authors themselves may also learn something
new from us about interesting new failure cases. It has happened before in this series. This episode has been supported by Weights
& Biases. Weights & Biases provides tools to track your
experiments in your deep learning projects. It is like a shared logbook for your team,
and with this, you can compare your own experiment results, put them next to what your colleagues
did, and discuss your successes and failures much more easily. It takes less than 5 minutes to set up and
is being used by OpenAI, Toyota Research, Stanford and Berkeley. It was also used in this OpenAI project that
you see here, which we covered earlier in the series. They reported that experiment tracking was
crucial in this project and that this tool saved them quite a bit of time and money. If only I had access to such a tool during
our last research project where I had to compare the performance of neural networks for months
and months. Well, it turns out, I will be able to get
access to these tools, because, get this, it’s free and will always be free for academics
and open source projects. Make sure to visit them through wandb.com
or just click the link in the video description and sign up for a free demo today. Our thanks to Weights & Biases for helping
us make better videos for you. Thanks for watching and for your generous
support, and I’ll see you next time!
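For the technically curious, the compress, edit, and regenerate loop described in this episode can be sketched in a few lines. This is a minimal toy sketch only: the encoder, generator, and "tree" concept direction below are made-up linear stand-ins with hypothetical names, not the paper's actual networks.

```python
# Toy sketch of the edit-in-latent-space loop: encode an image into a
# latent code, nudge the code along a "semantic brush" direction, then
# let the generator carry the edit through into the output image.
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoder": a linear projection from image space to a small latent space.
W_enc = rng.normal(size=(16, 64))          # latent_dim x image_dim

def encode(image):
    return W_enc @ image                   # image -> compact latent code

# Toy "generator": reconstructs an image from the latent code.
W_gen = np.linalg.pinv(W_enc)              # image_dim x latent_dim

def generate(latent):
    return W_gen @ latent

# A "semantic brush" edit: push the latent code along a concept direction
# (in a real model, a direction that would, say, add trees).
tree_direction = rng.normal(size=16)
tree_direction /= np.linalg.norm(tree_direction)

def apply_brush(latent, direction, strength):
    return latent + strength * direction

image = rng.normal(size=64)
z = encode(image)                               # 1) compress into latent space
z_edit = apply_brush(z, tree_direction, 3.0)    # 2) edit in the latent domain
edited_image = generate(z_edit)                 # 3) regenerate; the edit carries through

# The edit survives generation: the output differs from a plain
# reconstruction exactly along the decoded concept direction.
reconstruction = generate(z)
print(np.allclose(edited_image - reconstruction,
                  3.0 * (W_gen @ tree_direction)))  # True
```

In the real system, the encoder and generator are deep networks and the concept directions come from learned semantic brushes, but the principle is the same: change the latent code, and the generation step carries your change through.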
