Neural style transfer algorithms that rely on optimizing an image by gradient descent are painfully slow. You can make them faster by training a network that maps the original image directly to a stylized one minimizing the loss. I tried to make them faster still by stacking a few Perlin noise channels onto the input, along with the output of a very fast hand-written edge detector, and then performing only local processing. Unfortunately, this gives very poor results, and it is only acceptably fast for low-resolution inputs.
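The input-stacking idea can be sketched roughly as follows. This is a hypothetical reconstruction, not the original code: the noise generator here is a cheap bilinear stand-in for true Perlin noise, and the edge detector is a simple finite-difference gradient magnitude standing in for the hand-written one.

```python
import numpy as np

def noise_channel(h, w, scale=8, seed=0):
    """Cheap Perlin-like noise: bilinear upsampling of a coarse random grid.
    (Real Perlin noise interpolates gradients; this is a simple stand-in.)"""
    rng = np.random.default_rng(seed)
    coarse = rng.random((h // scale + 2, w // scale + 2))
    ys = np.linspace(0, coarse.shape[0] - 2, h)
    xs = np.linspace(0, coarse.shape[1] - 2, w)
    y0, x0 = ys.astype(int), xs.astype(int)
    fy, fx = ys - y0, xs - x0
    top = coarse[y0][:, x0] * (1 - fx) + coarse[y0][:, x0 + 1] * fx
    bot = coarse[y0 + 1][:, x0] * (1 - fx) + coarse[y0 + 1][:, x0 + 1] * fx
    return top * (1 - fy[:, None]) + bot * fy[:, None]

def edge_channel(gray):
    """Hand-rolled edge map: gradient magnitude via finite differences."""
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1]))
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    return gy + gx

def stack_inputs(img, n_noise=3):
    """Concatenate RGB, noise channels, and an edge map along depth,
    so a purely local network sees more than raw pixel values."""
    h, w, _ = img.shape
    gray = img.mean(axis=2)
    noise = [noise_channel(h, w, seed=s) for s in range(n_noise)]
    return np.dstack([img] + noise + [edge_channel(gray)])
```

The point of the extra channels is to hand a strictly local model some non-local structure (broad noise patterns, edges) it could not otherwise see.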
So I gave up on that and started reading about non-ML style transfer and texture synthesis algorithms; it's a fascinating topic. In the end, I managed to create an algorithm that performs real-time (if low-quality) high-resolution style transfer by using Numba to compile the Python code. Sadly, when I tried to turn that into an Android app, I found, despite trying really hard, that there was no way to run Numba on Android, and using NumPy directly isn't fast enough, so I gave up on that too.
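The Numba trick looks roughly like this. The kernel below is a toy stand-in (posterizing each pixel from its 3x3 neighborhood mean), not the actual style transfer algorithm; the relevant part is that `@njit` compiles the explicit pixel loops to machine code, which is what makes real-time speeds possible where plain NumPy falls short.

```python
import numpy as np

try:
    from numba import njit  # JIT-compiles the hot loops to machine code
except ImportError:          # hedge: fall back to pure Python if Numba is absent
    def njit(*args, **kwargs):
        if args and callable(args[0]):
            return args[0]
        return lambda f: f

@njit(cache=True)
def local_stylize(gray, levels=4):
    """Toy local kernel: posterize each pixel using its 3x3 neighborhood mean.
    With Numba, the nested loops run at compiled speed instead of
    interpreter speed."""
    h, w = gray.shape
    out = np.empty_like(gray)
    for y in range(h):
        for x in range(w):
            acc = 0.0
            n = 0
            for dy in range(-1, 2):
                for dx in range(-1, 2):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += gray[yy, xx]
                        n += 1
            mean = acc / n
            out[y, x] = np.floor(mean * levels) / levels
    return out
```

The catch described above: Numba emits native code through LLVM on the host machine, so this approach does not carry over to an Android runtime, and rewriting the loops as vectorized NumPy is not fast enough for real-time use.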