Creating Your Own Ink-washing Paintings
We propose an interactive platform to spark people's interest in sketching their own ink-washing paintings. The platform is designed for ordinary people without any programming or data science experience. We chose to build it on Google Colab since it is free of charge and provides access to GPUs.
In the notebook, users will learn how to apply our pre-trained StyleGAN2-ADA model to generate their own ink-washing paintings. In total, we cover four functions of the model:
- Image generation: use the model to generate a new ink-washing painting that has never existed before.
- Linear interpolation: linear interpolation moves smoothly between two points in the latent space, so it lets us see how one image gradually transforms into another.
- Latent space exploration: we extend the idea of linear interpolation to examine the feature vectors that govern the style of the output image.
- Tailored ink-washing paintings: by combining the applications above, you can create your own painting by adjusting different feature vectors.
Those who are comfortable with Python are strongly encouraged to experiment with the code and explore other interesting applications of the model.
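For a sense of what the notebook wraps, here is a minimal sketch of loading the generator and sampling one image. It assumes the NVIDIA stylegan2-ada-pytorch repository has been cloned into the Colab session (its dnnlib and torch_utils modules must be importable for unpickling), and "network.pkl" is only a placeholder path for our pre-trained checkpoint.

```python
# Minimal sketch, not the notebook's exact code: load the pre-trained
# generator and sample a single ink-washing painting from a random latent.
import pickle

import torch

with open('network.pkl', 'rb') as f:                       # placeholder checkpoint path
    G = pickle.load(f)['G_ema'].cuda()                      # trained generator (torch.nn.Module)

z = torch.randn([1, G.z_dim]).cuda()                        # one random latent code
img = G(z, None, truncation_psi=0.7, noise_mode='const')    # NCHW float32 image in [-1, 1]
```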
Interactive platform:
Image generation
For image generation, users only need to fill in the table shown in Fig. 1. The meaning of each row is listed below:
- random_sampling: if ticked, the model generates images from randomly chosen seeds.
- seed: if random sampling is off, specify the seed (any integer) used to generate the image.
- num_of_image: the number of images to generate.
- noise_mode: choose between constant and random noise.
- truncation_psi: controls the trade-off between fidelity and variety; values closer to 0 produce more typical but less diverse images. You are encouraged to try different values and compare the results.
- save_output: whether to save the results to Google Drive.
- display_model: choose one of the three options for displaying the generated images.
- file_name: the name of the file used to save the outputs.
For reference, the output for the values used in Fig. 1 is shown in Fig. 2, and a short code sketch of how these fields translate into generator calls is given after the figures.
Fig. 1
Fig. 2
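As a rough illustration of how the fields above map onto generator calls (not the notebook's exact implementation), a sketch in the spirit of the official generate.py script could look as follows; the checkpoint path and parameter values are placeholders.

```python
# Hedged sketch of seeded image generation; variable names mirror the table fields.
import pickle

import numpy as np
import PIL.Image
import torch

with open('network.pkl', 'rb') as f:                        # placeholder checkpoint path
    G = pickle.load(f)['G_ema'].cuda()

seed, num_of_image = 42, 3                                   # example values
truncation_psi, noise_mode = 0.7, 'const'                    # noise_mode: 'const' or 'random'

for i in range(num_of_image):
    rng = np.random.RandomState(seed + i)                    # one reproducible latent per seed
    z = torch.from_numpy(rng.randn(1, G.z_dim)).float().cuda()
    img = G(z, None, truncation_psi=truncation_psi, noise_mode=noise_mode)
    img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
    PIL.Image.fromarray(img[0].cpu().numpy(), 'RGB').save(f'seed{seed + i:04d}.png')
```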
Linear interpolation
Similarly, users can perform linear interpolation between two points through the table shown in Fig. 3. Most of the parameters are the same as in Fig. 1; the meanings of the newly added parameters are listed below:
- start_seed: the seed of the starting image.
- end_seed: the seed of the ending image.
- num_of_frames: the number of steps in the linear interpolation.
- space: whether to interpolate in Z-space or in W-space.
Fig. 4 shows the output for the values used in Fig. 3, and a code sketch of the interpolation loop follows the figures.
Fig. 3
Fig. 4
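The interpolation itself reduces to a weighted average of the two endpoint latents. Below is a hedged sketch under the same assumptions as before (pre-trained generator loaded from a placeholder "network.pkl"); it covers both the Z-space and W-space options.

```python
# Sketch of linear interpolation between two seeds, in Z-space or W-space.
import pickle

import numpy as np
import torch

with open('network.pkl', 'rb') as f:                          # placeholder checkpoint path
    G = pickle.load(f)['G_ema'].cuda()

start_seed, end_seed, num_of_frames, space = 10, 20, 8, 'w'    # example values

z0 = torch.from_numpy(np.random.RandomState(start_seed).randn(1, G.z_dim)).float().cuda()
z1 = torch.from_numpy(np.random.RandomState(end_seed).randn(1, G.z_dim)).float().cuda()
w0, w1 = G.mapping(z0, None), G.mapping(z1, None)              # endpoints mapped to W-space

frames = []
for i in range(num_of_frames):
    t = i / (num_of_frames - 1)                                # interpolation weight in [0, 1]
    if space == 'z':
        img = G((1 - t) * z0 + t * z1, None, noise_mode='const')        # blend latent codes
    else:
        img = G.synthesis((1 - t) * w0 + t * w1, noise_mode='const')    # blend style vectors
    frames.append(img)                                         # NCHW tensors in [-1, 1]
```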
Latent space exploration
Exploring the feature vectors is one of the most important ways to understand the model and the dataset. Through the table shown in Fig. 5, users can examine the effect of each feature vector. The meanings of the newly added parameters are listed below:
- feature_vector_index: the index of the feature vector to test.
- step_size: the strength of the linear interpolation along that feature vector; a larger value results in a stronger change.
The corresponding output is shown in Fig. 6, and a code sketch of the exploration step follows the figures.
Fig. 5
Fig. 6
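The feature vectors are fixed directions in latent space; since their derivation is not shown here, the sketch below hypothetically obtains them as principal components (PCA) of a batch of mapped w vectors, which is one common way to find such directions. All parameter values are placeholders.

```python
# Hedged sketch of feature vector exploration: shift a seed's w vector along
# one (hypothetically PCA-derived) direction and re-synthesize the image.
import pickle

import numpy as np
import torch

with open('network.pkl', 'rb') as f:                      # placeholder checkpoint path
    G = pickle.load(f)['G_ema'].cuda()

# Assumption: derive candidate feature vectors via PCA on sampled w vectors.
z = torch.randn([1000, G.z_dim]).cuda()
w_samples = G.mapping(z, None)[:, 0, :]                    # one w per sample, shape [N, w_dim]
_, _, V = torch.pca_lowrank(w_samples, q=10)               # columns of V are feature vectors

feature_vector_index, step_size, seed = 0, 3.0, 42         # example values
z0 = torch.from_numpy(np.random.RandomState(seed).randn(1, G.z_dim)).float().cuda()
w0 = G.mapping(z0, None)                                   # base style vector [1, num_ws, w_dim]

direction = V[:, feature_vector_index]                     # chosen feature vector, shape [w_dim]
img_base = G.synthesis(w0, noise_mode='const')             # original image
img_edit = G.synthesis(w0 + step_size * direction, noise_mode='const')  # shifted image
```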
Generate tailored ink-washing painting
By extending the ideas of feature vectors and linear interpolation, we developed a UI that lets users customize the computer-generated image by jointly modifying the feature vectors, as shown in Fig. 7.
Fig. 7
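At its core, the customization UI combines the previous two ideas: take a seed image's style vector and add a user-chosen amount of each feature vector before synthesis. The sketch below assumes the generator G and the (hypothetical) feature-vector matrix V from the previous sketch are already in scope.

```python
# Minimal sketch of the tailoring idea: accumulate several feature-vector
# edits on one style vector, then synthesize the customized painting.
import numpy as np
import torch

def tailor(G, V, seed, step_sizes):
    """step_sizes maps a feature_vector_index to the strength applied along it."""
    z = torch.from_numpy(np.random.RandomState(seed).randn(1, G.z_dim)).float().cuda()
    w = G.mapping(z, None)                          # base style vector [1, num_ws, w_dim]
    for index, step in step_sizes.items():
        w = w + step * V[:, index]                  # apply each requested edit
    return G.synthesis(w, noise_mode='const')       # tailored image, NCHW in [-1, 1]

# Example (placeholder values): img = tailor(G, V, seed=42, step_sizes={0: 2.5, 3: -1.0})
```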
To conclude, the proposed interactive platform provides a bridge for non-professional users to experience and appreciate both the power of state-of-the-art deep learning models and traditional Chinese ink-washing painting. The model supports further applications, such as image projection, and interested users are strongly encouraged to explore them.