Thankfully, I stumbled upon a paper featured by the Two Minute Papers YouTube channel over the weekend that aims to restore and colorize these videos. The model uses temporal convolutional neural networks to identify and correct defects such as flicker in vintage footage.

Satoshi Iizuka and Edgar Simo-Serra, co-authors of the paper, explain that the model is designed to handle multiple tasks, such as noise removal and colorization, to enhance the quality of these videos.

To test their neural network's mettle, the researchers also compared its output against that of older models built to colorize and restore vintage videos. Check out the video below to see it in action; the top-left video is the input, and the bottom-right video is the output of the new neural network.

While the neural network takes care of the blemishes in the video, developers need to provide a reference image for colorization; plenty of AI models for colorizing old photos can produce one. A rough sketch of how reference-guided colorization can work appears at the end of this article.

We haven't yet seen this type of model used commercially by major movie studios to enhance old films. But with neural networks improving, we can expect Hollywood and other film industries across the world to turn to AI in the near future.

You can read the full paper here and check out Two Minute Papers' explanation of the model here.
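
For readers curious what "remastering with a reference image" can look like in practice, here is a minimal PyTorch sketch. It is not the authors' architecture, just a toy pipeline under two assumptions: a 3D-convolutional denoiser smooths the clip across time, and an attention module borrows color from a single user-supplied reference frame. Every module and tensor name here is an illustrative placeholder.

```python
import torch
import torch.nn as nn


class ToyRestorer(nn.Module):
    """Smooths flicker and noise across time with 3D convolutions over (T, H, W)."""

    def __init__(self, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, ch, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(ch, 1, kernel_size=3, padding=1),
        )

    def forward(self, gray_clip):  # (B, 1, T, H, W), values in [0, 1]
        return torch.sigmoid(self.net(gray_clip))


class ToyReferenceColorizer(nn.Module):
    """Copies color from a reference image to a grayscale frame via dot-product
    attention between per-pixel features of the frame and the reference."""

    def __init__(self, ch=16):
        super().__init__()
        self.frame_enc = nn.Conv2d(1, ch, 3, padding=1)
        self.ref_enc = nn.Conv2d(3, ch, 3, padding=1)
        self.refine = nn.Conv2d(3, 3, 1)  # light refinement of the transferred color

    def forward(self, gray_frame, ref_rgb):  # (B, 1, H, W) and (B, 3, H, W)
        q = self.frame_enc(gray_frame).flatten(2)           # (B, C, H*W) queries
        k = self.ref_enc(ref_rgb).flatten(2)                 # (B, C, H*W) keys
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)  # (B, H*W, H*W)
        v = ref_rgb.flatten(2).transpose(1, 2)               # (B, H*W, 3) reference colors
        color = (attn @ v).transpose(1, 2).reshape(ref_rgb.shape)
        return torch.sigmoid(self.refine(color))


if __name__ == "__main__":
    clip = torch.rand(1, 1, 8, 64, 64)    # 8 grayscale frames of a damaged clip
    reference = torch.rand(1, 3, 64, 64)  # one hand-colorized reference frame
    restorer, colorizer = ToyRestorer(), ToyReferenceColorizer()
    restored = restorer(clip)
    frames = [colorizer(restored[:, :, t], reference) for t in range(restored.shape[2])]
    print(torch.stack(frames, dim=2).shape)  # torch.Size([1, 3, 8, 64, 64])
```

The real model is far more sophisticated, but the overall idea is the same: clean up the frames with temporal context, then propagate color from an example image that a human provides.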