Computer Science
Permanent URI for this collection: https://hdl.handle.net/10679/43
Browsing by Author "Ahsan, S."
Conference Object (Metadata only)
Benchmarking the second edition of the omnidirectional media format standard (IEEE, 2022)
Authors: Kara, Burak; Akçay, Mehmet Necmettin; Beğen, Ali Cengiz; Ahsan, S.; Curcio, I. D. D.; Kammachi-Sreedhar, K.; Aksu, E.
Abstract: The Omnidirectional MediA Format (OMAF), completed in 2019, is the first worldwide virtual reality (VR) standard for storing and distributing immersive media. The second edition of the standard (OMAF v2), published in 2021, kept all the features defined in the first edition while introducing new ones, such as overlays and multi-viewpoints. OMAF v2's Tile Index Segments, which carry per-segment metadata for tracking fragment data and quality levels, create a bandwidth overhead. During the OMAF v2 standardization, multiple methods for representing the track fragment runs were studied to deal with this overhead. This paper presents the implementation of one of these methods, the compressed box method using the DEFLATE algorithm (OMAF v2*). It also provides comprehensive test results for OMAF v1, OMAF v2 and OMAF v2* with various combinations of three tile grids (6x4, 8x6 and 12x8), three segment durations (300 ms, 900 ms and 3 s), two videos (RollerCoaster and Timelapse), two bitrate groups (each with four different bitrates) and two HTTP versions (HTTP/1.1 and HTTP/2).

Article (Metadata only)
Could head motions affect quality when viewing 360° videos? (IEEE, 2023-04-01)
Authors: Kara, Burak; Akçay, Mehmet Necmettin; Beğen, Ali Cengiz; Ahsan, S.; Curcio, I. D. D.; Aksu, E. B.
Abstract: Measuring quality accurately and quickly (preferably in real time) when streaming 360° videos is essential to enhance the user experience. Most quality-of-experience metrics have primarily used viewport quality as a simple surrogate for such experiences at a given time. While some researchers have since augmented this baseline approach with pupil and gaze tracking, head tracking has not been considered in enough detail. This article tackles whether head motions can influence the perception of 360° videos. Inspired by the latest research, it conceptualizes a head-motion-aware metric for measuring viewport quality. A comparative study against existing head-motion-unaware metrics reveals sizeable differences. Motivated by this, we invite the community to research this topic further and substantiate the new metric's validity.

Conference Object (Metadata only)
Head-motion-aware viewport margins for improving user experience in immersive video (ACM, 2022-01-10)
Authors: Akçay, Mehmet Necmettin; Kara, Burak; Ahsan, S.; Beğen, Ali Cengiz; Curcio, I.; Aksu, E.
Abstract: Viewport-dependent delivery (VDD) is a technique to save network resources during the transmission of immersive videos. However, it results in a non-zero motion-to-high-quality delay (MTHQD), the time from the moment the current viewport has at least one low-quality tile to the moment all the tiles in the new viewport are rendered in high quality. MTHQD is an important metric in the evaluation of VDD systems. This paper improves an earlier concept called viewport margins by introducing head-motion awareness. The primary benefit of this improvement is a reduction of up to 64% in the average MTHQD.

Conference Object (Metadata only)
Quality upshifting with auxiliary I-frame splicing (IEEE, 2023)
Authors: Akçay, Mehmet Necmettin; Kara, Burak; Beğen, Ali Cengiz; Ahsan, S.; Curcio, I. D. D.; Kammachi-Sreedhar, K.; Aksu, E.
Abstract: This paper introduces the Auxiliary I-Frame Splicing method to reduce bandwidth waste in adaptive streaming. The method fetches a high-quality I-frame and splices it into an already downloaded low-quality segment, yielding a higher-quality rendering at a lower overhead than replacing the entire low-quality segment. In our experiments with three videos and four quantization parameters, up to 87% of the bandwidth can be saved while still increasing the peak signal-to-noise ratio (PSNR) score by 20% and the video multi-method assessment fusion (VMAF) score by 73%. In the demo, we demonstrate the visual differences between the original and spliced videos.

Conference Object (Metadata only)
Rate-adaptive streaming of 360-degree videos with head-motion-aware viewport margins (IEEE, 2022)
Authors: Akçay, Mehmet Necmettin; Kara, Burak; Beğen, Ali Cengiz; Ahsan, S.; Curcio, I. D. D.; Aksu, E.
Abstract: Efficient use of available bandwidth is vital when streaming 360-degree videos, as users rarely have enough bandwidth for a pleasant experience. A promising solution combines viewport-dependent streaming using tiled video with rate adaptation, where the goal is to spend most of the available bandwidth on the viewport tiles. However, head motions that change the viewport tiles briefly cause low-quality rendering until the new tiles can be replaced with high-quality versions. Previously, viewport margins (fixed regions around the viewport rendered at a medium quality) were proposed to make viewport changes less abrupt. Later, Head-Motion-Aware Viewport Margins (HMAVM) were implemented to further smooth the transitions at the expense of increased bandwidth consumption. In this paper, we manage the overall bandwidth cost of HMAVMs better by first developing a set of algorithms that trade off the quality of some viewport tiles and then making the margin selection part of the rate-adaptation algorithm.
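The motion-to-high-quality delay (MTHQD) metric used in the viewport-margin papers above can be illustrated with a minimal sketch. The event timestamps, the `average_mthqd` helper and its name are hypothetical illustrations, not part of the papers' measurement pipeline; the sketch only follows the definition given in the abstract (time from the first low-quality tile in the viewport to all tiles of the new viewport rendering in high quality):

```python
# Minimal sketch of motion-to-high-quality delay (MTHQD) averaging.
# Hypothetical input: each viewport change is a (t_low, t_high) pair, where
#   t_low  = first moment the current viewport contains a low-quality tile
#   t_high = moment every tile in the new viewport renders in high quality

def average_mthqd(events):
    """Average MTHQD in seconds over a list of (t_low, t_high) pairs."""
    delays = [t_high - t_low for t_low, t_high in events]
    return sum(delays) / len(delays)

# Example: three viewport changes with delays of 0.4 s, 0.2 s and 0.3 s
events = [(10.0, 10.4), (15.0, 15.2), (21.5, 21.8)]
print(round(average_mthqd(events), 2))  # 0.3
```

Under this definition, the 64% improvement reported for head-motion-aware margins corresponds to lowering this average across all viewport changes in a session.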