{"id":925,"date":"2021-03-02T16:32:34","date_gmt":"2021-03-02T16:32:34","guid":{"rendered":"http:\/\/nitk.acm.org\/blog\/?p=925"},"modified":"2021-03-02T16:32:34","modified_gmt":"2021-03-02T16:32:34","slug":"slow-motion-merely-a-change-in-refresh-rate-or-a-lot-more","status":"publish","type":"post","link":"https:\/\/nitk.acm.org\/blog\/2021\/03\/02\/slow-motion-merely-a-change-in-refresh-rate-or-a-lot-more\/","title":{"rendered":"Slow Motion Merely a Change in Refresh Rate or a Lot More?"},"content":{"rendered":"\n<p>Typically slow-motion is achieved when each\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/en.wikipedia.org\/wiki\/Film\">film<\/a>\u00a0frame is captured at a\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/en.wikipedia.org\/wiki\/Framerate\">rate<\/a>\u00a0much faster than it will be played back. When replayed at normal speed, time appears to be moving more slowly. A term for creating a slow-motion film is over cranking which refers to hand cranking an early camera at a faster rate than normal (i.e. faster than 24 frames per second). Slow-motion can also be achieved by playing normally recorded footage at a slower speed. This technique is more often applied to video subjected to\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/en.wikipedia.org\/wiki\/Instant_replay\">instant replay<\/a>\u00a0than to film. 
A third technique, which is becoming common with current post-processing software (programs such as\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/en.wikipedia.org\/w\/index.php?title=Twixtor&amp;action=edit&amp;redlink=1\">Twixtor<\/a>\u00a0and Kandao), is to fabricate\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/en.wikipedia.org\/wiki\/Motion_interpolation\">digitally interpolated<\/a>\u00a0frames that transition smoothly between the frames that were actually shot.<\/p>\n\n\n\n<p>Before we understand how slow motion works, we need to unravel what exactly refresh rate is, and to do that, we need to know how displays work. There\u2019s a lot of technical detail involved here, but at its most basic level, a display works by showing you a series of images, or \u201cframes\u201d, one after another. The \u201crefresh rate\u201d is how many times the image is updated per second, so a 60Hz display refreshes its image 60 times a second. This is far too fast for your brain to track each frame individually, so it is tricked into seeing a moving image rather than a series of still frames.<\/p>\n\n\n\n<p>A higher refresh rate means more images are shown in the same amount of time, so the gap between successive frames shrinks and any movement between them appears smoother. While not something you\u2019re likely to consciously notice, most people can feel some difference between refresh rates: the whole system feels more responsive, as it seems to react more quickly to your commands.<\/p>\n\n\n\n<p>Smartphones are getting more and more powerful, but with last generation\u2019s hardware still holding its own, the jump from generation to generation doesn\u2019t seem as great as it once did. 
Where are manufacturers to go when a new phone doesn\u2019t feel more powerful than last year\u2019s device? One alternative is to make it feel smoother and more responsive \u2014 and a great way to do that is to increase the refresh rate of its display.<\/p>\n\n\n\n<p>Refresh rate sounds similar to your graphics processor\u2019s frame rate, and that\u2019s because it is. Frame rate is measured in frames per second, or \u201cfps\u201d, and it is how quickly a graphics processor can render and deliver individual images to your display. You\u2019ll need a monitor with a refresh rate of at least 120Hz to display 120fps properly. However, while refresh rate is related to fps, the two are not the same thing: refresh rate is a property of the monitor itself, while frame rate is how quickly your graphics processor sends new images to the monitor.<\/p>\n\n\n\n<p>But a higher refresh rate isn\u2019t just about day-to-day usability. Gaming performance is one of the biggest beneficiaries, because a display with a higher refresh rate also has lower input lag. Input lag is the time between an action being triggered and its result appearing on the display. A standard 60Hz display cannot have an input lag lower than 16.67ms, because that\u2019s how long each refresh takes, while a 120Hz display can reach 8.33ms, as it refreshes twice as often.<\/p>\n\n\n\n<p>It\u2019s easy to spot poorly edited videos in which an editor takes standard-frame-rate footage and just changes the playback speed in Adobe Premiere Pro or Final Cut Pro X. 
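Those input-lag floors are simply the reciprocal of the refresh rate (1000 ms divided by 60 is roughly 16.67 ms). A minimal Python sketch of the arithmetic (the function name is illustrative, not from any real API):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time for one full refresh in milliseconds -- the floor on input lag."""
    return 1000.0 / refresh_hz

# Higher refresh rate => shorter refresh interval => lower input-lag floor
for hz in (60, 120, 240):
    print(f"{hz}Hz -> {frame_interval_ms(hz):.2f}ms per refresh")
```

Doubling the refresh rate halves the interval, which is why 120Hz panels feel so much more immediate than 60Hz ones.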
Footage looks choppy and awful.<\/p>\n\n\n\n<p>Kandao, makers of the&nbsp;<a target=\"_blank\" href=\"https:\/\/www.bhphotovideo.com\/c\/product\/1430614-REG\/kandao_220464_obsidian_r_professional_3d.html?BI=6857&amp;KBID=7410\" rel=\"noreferrer noopener\">Obsidian<\/a>&nbsp;and&nbsp;<a target=\"_blank\" href=\"https:\/\/www.bhphotovideo.com\/c\/product\/1430616-REG\/kandao_220467_qoocam_interchangeable_3d_360_vr.html?BI=6857&amp;KBID=7410\" rel=\"noreferrer noopener\">QooCam<\/a>&nbsp;line of 360 and VR cameras, has just released software that enables video shooters to take normal video and slow it down by up to 10X while still looking smooth. The catch is that the software only works with Kandao cameras \u2014 for now.<\/p>\n\n\n\n<p>Kandao\u2019s software manages this by using machine learning to fill in the gaps in the footage so that the result doesn\u2019t look awful. Depending on the frame rate the video was originally shot at, you can even get some insane results, such as footage slowed down to <a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/www.youtube.com\/watch?v=KYT9kJFf8cA\">look as if it was shot with a 1200fps-capable camera<\/a>.<\/p>\n\n\n\n<p>The magic, Kandao says, comes from machine learning: its neural networks create smoother interpolated frames than existing software such as Twixtor, which relies on optical-flow technology.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">NVIDIA \u201cSuper SloMo\u201d Makes Video Smooth<\/h4>\n\n\n\n<p>All of the most recent cameras have raised the bar when it comes to frame rates. It\u2019s common to have access to a camera that shoots 120fps in Full HD these days. 
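The 10X figure works out simply: interpolating extra frames between the captured ones multiplies the effective capture rate, which is how 120fps footage can be made to look as if it came from a 1200fps camera. A toy Python sketch of that relationship (the function name is my own):

```python
def equivalent_capture_fps(shot_fps: int, slow_factor: int) -> int:
    """Slowing footage N-fold while keeping playback smooth is equivalent
    to having captured it at N times the original frame rate."""
    return shot_fps * slow_factor

# 120fps footage slowed 10X plays back as if captured at 1200fps
print(equivalent_capture_fps(120, 10))
```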
There is, however, a set of limitations to deal with \u2013 thermal, buffer, and memory-card issues, among others \u2013 key factors that make a true super-slow-motion camera extremely expensive.<\/p>\n\n\n\n<p>But again, the advancement of technology seems to be on the content creators\u2019 side, as NVIDIA has developed a software algorithm that applies super slow motion to any kind of footage, all through the power of machine learning.<\/p>\n\n\n\n<p>Basically, in deep learning, you write an algorithm and then train it to recognize certain patterns by repeating the process over a large number of examples.<\/p>\n\n\n\n<p>In this case, the software analyzes the movement in a clip and slows it down while creating interpolated sub-frames so that playback remains smooth. Keep in mind that, using this software, a 17-second clip took 12 minutes to convert with CUDA acceleration, and 6 hours without.<\/p>\n\n\n\n<p>NVIDIA researchers have developed this deep-learning-based system, which can produce a high-quality slow-motion video from a standard (30 fps) video clip. In comparison with manual slow-motion results, the NVIDIA demonstration video shows far superior smoothness. 
The technique generates intermediate frames to achieve the super slow-motion effect, and since it can generate an indefinite number of such intermediate frames, there is no limit to how slow the video can be made to go.<\/p>\n\n\n\n<p>The paper&nbsp;<em>Super SloMo: High-Quality Estimation of Multiple Intermediate Frames for Video Interpolation<\/em>, along with NVIDIA\u2019s presentation at the Conference on Computer Vision and Pattern Recognition (CVPR), is the latest research from NVIDIA on such AI-empowered video transformation techniques.&nbsp;<\/p>\n\n\n\n<p>CVPR spotlight video:&nbsp;<a target=\"_blank\" href=\"https:\/\/people.cs.umass.edu\/~hzjiang\/projects\/superslomo\/superslomo_cvpr18_spotlight_v4.mp4\" rel=\"noreferrer noopener\">https:\/\/people.cs.umass.edu\/~hzjiang\/projects\/superslomo\/superslomo_cvpr18_spotlight_v4.mp4<\/a><\/p>\n\n\n\n<p>The paper introduces an end-to-end convolutional neural network for variable-length multi-frame video interpolation, which generates intermediate frame(s) between two consecutive frames to form both spatially and temporally coherent video sequences.<\/p>\n\n\n\n<p>To address the challenge of generating multiple intermediate video frames, the researchers first computed bi-directional optical flow between the input images using a U-Net architecture. The flows were then linearly combined at each time step to approximate the intermediate bi-directional optical flows. Although these approximate flows work well in locally smooth regions, they can produce artifacts around motion boundaries. To address this, an additional U-Net is employed to refine the approximated flow and predict soft visibility maps. The two input images are then warped and linearly fused to form each intermediate frame. 
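The linear flow combination and visibility-weighted fusion just described can be sketched as follows. This is only an illustrative NumPy rendering of the equations from the Super SloMo paper (the function and variable names are my own, and a real implementation operates on learned, per-pixel flow fields and warped images):

```python
import numpy as np

def approx_intermediate_flows(f01, f10, t):
    """Approximate the flows from intermediate time t back to frames 0 and 1
    by linearly combining the bi-directional flows F_0->1 (f01) and
    F_1->0 (f10), following the Super SloMo paper."""
    f_t0 = -(1.0 - t) * t * f01 + t * t * f10
    f_t1 = (1.0 - t) ** 2 * f01 - t * (1.0 - t) * f10
    return f_t0, f_t1

def fuse_warped(warp0, warp1, vis0, vis1, t, eps=1e-8):
    """Visibility-weighted linear fusion of the two images warped to time t;
    pixels occluded in one view (visibility near 0) contribute nothing."""
    num = (1.0 - t) * vis0 * warp0 + t * vis1 * warp1
    den = (1.0 - t) * vis0 + t * vis1
    return num / (den + eps)  # eps guards against fully occluded pixels
```

Because `t` is a free parameter in (0, 1), the same pair of input frames can yield arbitrarily many intermediate frames, which is what removes the limit on how slow the output can go.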
To avoid artifacts, the team applies visibility maps to the warped images before fusion, excluding the contribution of occluded pixels from the interpolated intermediate frames.<\/p>\n\n\n\n<p>The NVIDIA multi-frame approach outperforms state-of-the-art single-frame methods on the Middlebury, UCF101, Slowflow, and High-framerate Sintel datasets. The paper\u00a0<em>Super SloMo: High-Quality Estimation of Multiple Intermediate Frames for Video Interpolation<\/em>\u00a0is available on\u00a0<a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/arxiv.org\/pdf\/1712.00080.pdf\">arXiv<\/a>.<\/p>\n\n\n\n<p><em>\u2013 Article by Satya Anirudh, 3rd year Electronics and Communications Engineering<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Typically slow-motion is achieved when each\u00a0film\u00a0frame is captured at a\u00a0rate\u00a0much faster than it will be played back. When replayed at normal speed, time appears to be moving more slowly. A term for creating a slow-motion film is over cranking which refers to hand cranking an early camera at a faster rate than normal (i.e. 
faster&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_exactmetrics_skip_tracking":false,"_exactmetrics_sitenote_active":false,"_exactmetrics_sitenote_note":"","_exactmetrics_sitenote_category":0,"footnotes":""},"categories":[26],"tags":[302,300,298,301],"class_list":["post-925","post","type-post","status-publish","format-standard","hentry","category-vidyut","tag-kandao","tag-nivedia-slomo","tag-slow-motion","tag-twixtor"],"_links":{"self":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/925","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/comments?post=925"}],"version-history":[{"count":2,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/925\/revisions"}],"predecessor-version":[{"id":927,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/925\/revisions\/927"}],"wp:attachment":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/media?parent=925"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/categories?post=925"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/tags?post=925"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}