Updated: Dec 10, 2019
The goal of this practice is to understand at least one small part of how Runway can communicate with the browser. It has been a rough semester trying to follow everything happening in the machine learning world, so I want to clarify at least one small piece of the whole picture.
I had a couple of ideas: one was to use AttnGAN to generate images from lyrics and then use those generated images to style-transfer the original music video; the other was to train StyleGAN on the generated images and create an animation of them morphing into one another along with the song, making a music video.
Breakers by Gem Club:
Into their cheering hands
And the faces of his friends
No blood turned so bitter
And skulls stayed simple
Every body in its place
Strange you should want it the same
Breakers in my lungs
For the first idea, the problem I ran into was that I could not get the original footage because of copyright; there is only a seven-second screen capture of it.
For the second idea, I thought it would be interesting to see what graphics machine learning generates from the lyrics, since they are vague and poetic, and then to watch those graphics morph along with the actual lyrics in the song. However, after watching all the tutorials, when I was ready to try it out I realized the train function is only available in a paid, "upgraded" version. From there, I decided to figure out just two things:
1. How to use StyleGAN in Runway and send generated images into a p5.js sketch in the browser
2. How to make morphing videos of those generated pictures using Runway and p5.js
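Step 2, the morphing, boils down to interpolating between two latent z vectors and asking the model for a frame at each intermediate point. The interpolation itself is a few lines of plain JavaScript; a minimal sketch (the helper names are mine, and 512 is StyleGAN's usual latent size, not something Runway requires):

```javascript
// Linear interpolation between two latent vectors a and b.
// t = 0 gives back a, t = 1 gives back b; values in between
// give the in-between latent codes that produce the morph frames.
function lerpZ(a, b, t) {
  return a.map((v, i) => v + (b[i] - v) * t);
}

// Sample a morph as n evenly spaced latent vectors from a to b.
// Each of these would be sent to the model to render one video frame.
function morphSteps(a, b, n) {
  const frames = [];
  for (let i = 0; i < n; i++) {
    frames.push(lerpZ(a, b, i / (n - 1)));
  }
  return frames;
}
```

Feeding each vector from `morphSteps` to the model in order, and recording the returned images, is what turns two random pictures into a smooth morph.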
Here are some results.
During the process, I worked through using httpPost() to request a single image from Runway in p5.js: first copying a single image's JSON into the browser to display it, then generating random(-1, 1) values for the z vector to produce random pictures, and finally exporting videos from the browser.
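That flow can be sketched roughly as below. This is a sketch under assumptions, not the exact code I ran: the localhost port and the shape of the request/response JSON depend on how Runway serves the model locally (check the model's Network tab in Runway for the real endpoint), and the `makeZ` helper is my own.

```javascript
// Hypothetical local Runway endpoint -- the actual host/port/path
// come from Runway's Network panel for the running StyleGAN model.
const RUNWAY_URL = 'http://localhost:8000/query';
const Z_DIM = 512; // assumed latent size for StyleGAN

// Build a random latent vector with entries in [-1, 1],
// mirroring the random(-1, 1) values described above.
function makeZ(n) {
  const z = [];
  for (let i = 0; i < n; i++) {
    z.push(Math.random() * 2 - 1);
  }
  return z;
}

let img;

function setup() {
  createCanvas(512, 512);
  requestImage();
}

function requestImage() {
  // p5's httpPost sends the JSON body and parses the JSON response.
  // The { z: ... } request shape and the result.image base64 data URL
  // in the response are assumptions about the model's JSON interface.
  httpPost(RUNWAY_URL, 'json', { z: makeZ(Z_DIM) }, (result) => {
    img = loadImage(result.image);
  });
}

function draw() {
  background(0);
  if (img) image(img, 0, 0, width, height);
}

function mousePressed() {
  requestImage(); // new random z -> new generated picture
}
```

Clicking the canvas fires a new request with a fresh z, which is enough to browse random outputs before moving on to recording frames.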