YouTube QoE evaluation tool for Android wireless terminals
© Gómez et al.; licensee Springer. 2014
Received: 21 January 2014
Accepted: 30 September 2014
Published: 10 October 2014
In this paper, we present an Android application which is able to evaluate and analyze the perceived quality of experience (QoE) for YouTube service in wireless terminals. To achieve this goal, the application carries out measurements of objective quality of service (QoS) parameters, which are then mapped onto subjective QoE (in terms of mean opinion score, MOS) by means of a utility function. Our application also informs the user about potential causes that lead to a low MOS and provides some hints to improve it. After each YouTube session, the users may optionally qualify the session through an online opinion survey. This information has been used in a pilot experience to correlate the theoretical QoE model with real user feedback. Results from this experience have shown that the theoretical model (taken from the literature) provides slightly more pessimistic results than the user feedback. Users seem to be more indulgent with wireless connections, raising the MOS from the opinion survey by about 20% over the theoretical model, which was obtained from wired scenarios.
Keywords: Quality of experience; Mean opinion score; YouTube; Android
Real-time entertainment services (comprised mostly of streaming video and audio) are becoming one of the dominant web-based services in telecommunications networks. In particular, YouTube is currently the largest single source of real-time entertainment traffic and the third most visited Internet site (preceded only by Google and Facebook), accounting for more Internet traffic than any other single service. Mobile networks have the highest proportion of real-time entertainment traffic; YouTube leads the way, accounting for 20% to 25% of total traffic in mobile networks. Additionally, 27.8% of all YouTube traffic (first half of 2012) was consumed on a smartphone or tablet .
The combination of increasing device capabilities, high-resolution content, and longer video duration (largely due to live content) means that YouTube's growth will continue for the foreseeable future. Driven by higher bitrates and the enhanced capabilities of mobile devices, the trend is also moving towards high-definition (HD) video, which considerably raises quality demands. Mobile network operators are following this trend closely, as it will strongly influence network requirements and subscriber quality of experience. In the context of video streaming services, quality of service (QoS) measurements become insufficient for evaluating the overall quality because they do not take user satisfaction into account. As a consequence, many network operators are starting to study the evaluation of quality of experience (QoE), which can be considered as the overall performance of a system from the user's perspective.
QoE has usually been evaluated through subjective tests carried out with users in order to assess their degree of satisfaction with a mean opinion score (MOS) indicator . This type of approach is obviously quite expensive, as well as annoying to the user. That is why, in recent years, new methods have been used to estimate the QoE based on certain performance indicators associated with services. The evaluation methodology used by most network operators to obtain statistical QoE is based on field testing. These tests often use mobile handsets as modems, with laptop computers performing the tests and collecting statistics. However, this process is expensive in terms of resources and staff, and it does not exercise the entire protocol stack implemented in the terminal. These drawbacks are solved by integrating QoE analyzers in the mobile terminal itself, so that quality measurements are specific to each terminal. Using this approach, additional measurements can be collected (along the protocol stack) to allow for enhanced analysis of the performance of each service. Furthermore, if mobile terminals are able to report the measurements to a server, the QoE assessment process is simplified significantly.
Recently, a number of works have focused on developing subjective QoE evaluation frameworks for mobile users. For instance, an implementation of a QoE measurement framework on the Android platform is presented in [3, 4], although results are limited to a laboratory environment. The works in [5–7] present frameworks for measuring the QoE of distorted videos in terms of peak signal-to-noise ratio (PSNR) or a modified metric called cPSNR, respectively. A non-intrusive video quality estimator based on parametric models that considers the group of pictures (GoP) length, type of frames, packet loss rate, etc. is presented in . A QoE framework for multimedia services (named QoM) for run-time quality evaluation of video streaming services is presented in ; this approach is based on the influence of QoE factors and various network- and application-level QoS parameters, although no evaluation of the proposed framework in the context of a real wireless network has been performed. In , the problem of YouTube QoE monitoring from an access provider's perspective is investigated, showing that it is possible to detect application-level stalling events by using network-level passive probing only. The work in  describes a tool that monitors YouTube application comfort, making it possible to estimate when the YouTube player is stalling.
Other works focus on specific YouTube models to compute the QoE. In [12, 13], different YouTube QoE models that take into account the key influence factors in quality perception (such as stalling events caused by network bottlenecks) are presented. They quantify the impact of initial delays on the user-perceived QoE by means of subjective laboratory and crowdsourcing studies. Other works are devoted to estimating the MOS for video services [3, 14, 15]; among them, the analysis presented in  provides a utility function for HTTP video streaming as a function of three application performance metrics: initial buffering time, mean rebuffering time, and rebuffering frequency.
However, none of the previous works has performed a deep validation of existing models through real tests over different radio technologies. In this work, we describe an Android application that carries out measurements of objective QoS indicators associated to YouTube service; these performance indicators are then mapped onto subjective QoE (in terms of MOS). Our application also informs the user about possible causes that lead to a low MOS, and provides some hints to improve it. After each YouTube session, the users may optionally qualify the session through an opinion survey. This information has been used in a pilot experience to correlate the theoretical QoE model with real user feedback.
The remainder of this paper is structured as follows. A description of the YouTube QoE evaluation method is given in Section 2, specifying its main performance indicators. In Section 3, we describe our Android application for YouTube QoE evaluation. The results from a YouTube evaluation pilot experience are analyzed in Section 4. Finally, some concluding remarks are given in Section 5.
2. YouTube QoE evaluation method
The YouTube service employs a progressive download technique, which enables playback of the video before the content download is completely finished . The old YouTube delivery service for mobile terminals (through the mobile YouTube link http://m.youtube.com) was based on a conventional video streaming architecture, i.e., the Real Time Streaming Protocol (RTSP) and the Real-time Transport Protocol (RTP), the latter being transported over the User Datagram Protocol (UDP). However, the current delivery service (both for smartphones and PCs) uses progressive video download via the HyperText Transfer Protocol (HTTP) over the Transmission Control Protocol (TCP).
Nowadays, TCP is the preferred transport protocol for YouTube and other video servers since the majority of video content delivery over the Internet is not live and most users’ bandwidth is usually greater than the video coding rate [17, 18]. The HTTP/TCP architecture also solves the problem of access blocking carried out by many firewalls for unknown UDP ports. Additionally, the continuous improvements in latency reduction and throughput maximization achieved in new cellular technologies have allowed the use of TCP for minimizing the impact of errors without reducing severely the effective throughput.
The video data transfer from the media server to the client consists of two phases: an initial burst of data and a throttling algorithm . In the initial phase, the media server sends an initial burst of data (whose size is determined by one of the setup parameters) at the maximum available bandwidth. Then, the server starts the throttling algorithm, where the data are sent at a constant rate (normally the video clip encoding rate multiplied by the throttle factor, also defined in the setup parameters). In a network congestion episode, data that cannot be delivered at this constant rate are buffered at the server and released as soon as the congestion is alleviated. When this occurs, data are sent at the maximum available bandwidth. Whenever the player's buffer runs out of data, the playback is paused, leading to a rebuffering event.
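The two delivery phases can be summarized in a back-of-the-envelope transfer-time sketch; the parameter names and the example values below are illustrative assumptions, not YouTube's actual setup parameters:

```python
def transfer_time(video_bytes, burst_bytes, coding_rate_Bps, throttle_factor, bandwidth_Bps):
    """Two-phase delivery sketch: an initial burst sent at the maximum
    available bandwidth, followed by data throttled to the video coding
    rate multiplied by the throttle factor. Rates in bytes/s; returns
    the total transfer time in seconds."""
    burst = min(burst_bytes, video_bytes)
    t_burst = burst / bandwidth_Bps                              # phase 1
    throttled_rate = min(coding_rate_Bps * throttle_factor, bandwidth_Bps)
    t_throttled = (video_bytes - burst) / throttled_rate         # phase 2
    return t_burst + t_throttled

# e.g., a 100 MB clip with a 10 MB burst, 1 MB/s coding rate,
# throttle factor 1.25, over a 5 MB/s link
t = transfer_time(100e6, 10e6, 1e6, 1.25, 5e6)
```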
Like the quality of Internet services in general, Internet video streaming quality mainly depends on throughput. However, quality requirements in terms of throughput are more demanding than those of other popular Internet applications such as file download, web browsing, and messaging. The main differences are that throughput has to meet rather precise requirements and that these requirements are stream-specific: if data are not transmitted according to the playing rate (corrected by the influence of initial buffering), a rebuffering will likely occur and user QoE will drop rapidly. It is therefore essential not only to measure the download throughput, but also to check it against the bitrate with which the individual stream is encoded.
There exist many quality metrics to characterize video quality. Some of them are based on comparing the received (and degraded) video with the original video (usually called the 'reference'). Examples of this type of quality metric are the mean square error (MSE) , peak signal-to-noise ratio (PSNR) , video structural similarity (VSSIM) , perceptual evaluation of video quality (PEVQ) , and video quality metric (VQM) . These metrics are useful for obtaining objective measurements in controlled experiments, but they are not applicable to online (real-time) procedures, as the full reference is not available. Furthermore, they are suited to measuring image quality degradation, e.g., due to packet losses or compression algorithms. Since TCP recovers packet losses, this type of metric is less useful for YouTube.
That is why other works aim to provide a model for estimating video quality without a reference. For instance, the work described in  presents a regression model to estimate the visual perceptual quality in terms of MOS for MPEG-4 videos over wireless networks. However, this algorithm requires an image reconstruction process to evaluate the differences between the original and the resulting images (after network transmission), which makes it inadequate for online quality estimation. In , the impact of delay and delay variation on the user's perceived quality for video streaming is analyzed. However, it does not consider other objective metrics such as resolution, frame rate, or packet losses, which are also important for obtaining an accurate QoE estimation. In , a no-reference subjective metric to evaluate video quality is presented, which considers the frame rate and picture resolution, although its computation is too complex for real-time use.
Our YouTube QoE evaluation method consists of three steps:

1. Estimate (through modeling) or measure the network QoS (e.g., throughput, round trip time, loss rate).

2. Convert the network QoS metrics onto application QoS (application performance metrics) by means of protocol modeling.

3. Map the application QoS onto the end user's QoE (in terms of MOS).
It should be noted that the first step might not be needed if the mobile terminal is equipped with a customized YouTube client that directly monitors and reports the application performance metrics. Otherwise, the mobile terminal shall be able to convert the network QoS onto application QoS by specific protocol modeling. For instance, there are different TCP performance models to estimate TCP throughput from network QoS [26, 27]. Afterwards, application performance metrics can be estimated at the receiver from performance indicators at lower layers (e.g., TCP throughput) as well as other parameters like the video coding rate, video length, buffer size at the receiver, or the minimum buffer threshold that triggers a rebuffering event (see  for further details).
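As an illustration of such TCP performance models, the well-known Mathis approximation estimates steady-state TCP throughput from the maximum segment size, round trip time, and packet loss rate. This particular formula is offered only as an example of the model class; the models actually referenced in [26, 27] may differ:

```python
from math import sqrt

def tcp_throughput(mss_bytes, rtt_s, loss_rate):
    """Mathis approximation: throughput ~ (MSS / RTT) * (C / sqrt(p)),
    with C ~ sqrt(3/2) for periodic losses. Returns bytes per second."""
    C = sqrt(3.0 / 2.0)
    return (mss_bytes / rtt_s) * (C / sqrt(loss_rate))

# e.g., MSS = 1460 bytes, RTT = 100 ms, 1% packet loss
bw = tcp_throughput(1460, 0.1, 0.01)
```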
The third step is performed by applying a utility function for HTTP video streaming as a function of three application performance metrics:
Initial buffering time (Tinit): time elapsed until a certain buffer occupancy threshold has been reached so the playback can start
Rebuffering frequency (frebuf): frequency of interruption events during the playback
Mean rebuffering time (Trebuf): average duration of a rebuffering event
Three valued levels of application performance metrics:

  Level        Tinit         frebuf          Trebuf
  1 (low)      0 to 1 s      0 to 0.02       0 to 5 s
  2 (medium)   1 to 5 s      0.02 to 0.15    5 to 10 s
  3 (high)     > 5 s         > 0.15          > 10 s
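From the coefficients discussed in the following paragraph, the utility function (Equation 1) takes the form of a linear regression over the quantized levels; this is a reconstruction, in which the initial-buffering coefficient appears with its unrounded value:

```latex
\mathrm{MOS} = 4.23 - 0.0672\, L_{ti} - 0.742\, L_{fr} - 0.106\, L_{tr}
```

where $L_{ti}$, $L_{fr}$, and $L_{tr}$ are the quantized levels of $T_{init}$, $f_{rebuf}$, and $T_{rebuf}$, respectively.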
From the previous equation, it can be seen that the maximum predicted MOS is 4.23. The coefficients for all service levels (Lti, Lfr, and Ltr) are negative, as any increase in them leads to worse service quality. Moreover, the rebuffering frequency metric (frebuf, quantized as Lfr) has the highest impact on the end user's QoE (regression coefficient −0.742), compared to the initial buffering time (regression coefficient −0.06) and the rebuffering duration (regression coefficient −0.106). In this respect, it is reasonable to think that the perceived quality does not depend only on the pause intensity (percentage of time in the pause state), since a higher number of pauses (of shorter duration) seems more annoying to the user.
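A minimal sketch of this mapping in code, assuming (consistently with the maximum predicted MOS of 4.23) that the quantized levels take the values 0, 1, and 2 for the low, medium, and high ranges, with the thresholds of the table above:

```python
def level(value, low_max, mid_max):
    """Quantize a metric into level 0 (low), 1 (medium), or 2 (high)."""
    if value <= low_max:
        return 0
    if value <= mid_max:
        return 1
    return 2

def youtube_mos(t_init_s, f_rebuf, t_rebuf_s):
    """Utility function of Equation 1 with the coefficients quoted in
    the text (initial-buffering coefficient taken as -0.0672)."""
    l_ti = level(t_init_s, 1.0, 5.0)      # initial buffering time (s)
    l_fr = level(f_rebuf, 0.02, 0.15)     # rebuffering frequency
    l_tr = level(t_rebuf_s, 5.0, 10.0)    # mean rebuffering time (s)
    return 4.23 - 0.0672 * l_ti - 0.742 * l_fr - 0.106 * l_tr

best = youtube_mos(0.5, 0.0, 0.0)   # all metrics in the lowest range
```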
3. Android application for YouTube QoE evaluation
The application can operate in two modes:

Intrusive mode: Our application includes an embedded video player, thus providing access to the content being consumed through the YouTube Application Programming Interface (API).
Transparent mode: Our application runs in the background, so monitoring functionalities are associated to YouTube sessions established either through the native YouTube application or through the web browser.
Our Android application includes the following modules:
Monitoring: This module is responsible for monitoring network QoS parameters as well as other configuration parameters required to estimate the application performance metrics (listed in the previous section). It makes use of the Android networking libraries and the YouTube Data API to obtain a number of parameters associated to the session, as detailed next.
QoE estimation: This is in charge of (automatically) computing the QoE of a YouTube session (in terms of MOS) from QoS parameters, according to Equation 1.
QoE advices: This informs the user about possible causes that lead to a low MOS and provides some hints to improve it.
QoE user feedback: This allows users to qualify the session through an opinion survey. This information is used to correlate the QoE model with real user feedback.
QoE reporting: This module is responsible for reporting all the performance indicators to a QoE server for post-processing purposes.
List of parameters that are reported (from the terminal) to the QoE server
International Mobile Equipment Identity (15-digit format)
It indicates the application used for video reproduction:
1 - Embedded video player (based on media player)
2 - YouTube native application
3 - Web browser
Total reproduction time (in ms) including rebuffering and user-originated pauses
Date of the YouTube video session (YYYY-MM-DD)
Hour of the YouTube video session (HH:MM:SS)
Application Performance Metrics
InitialBuffering Time (Tinit)
Total time (in ms) since the user starts the session until the video is ready to be played
Rebuffering Frequency (frebuf)
Frequency of interruption events (not forced by the user) during the playback
Mean Rebuffering Time (Trebuf)
Average duration of a rebuffering event (in ms)
Location of the measurement
Expressed in sexagesimal degrees (−90, 90)
Expressed in sexagesimal degrees (−180, 180)
Expressed in meters above sea level
Precision in meters of the location measurements
Moment at which the location measurement was done (YYYY-MM-DD_HH:MM:SS format)
Method to perform location measurements: GPS or network-assisted
Type of network data connection active for the session. Possible values: 0(WIFI), 1(GPRS), 2(EDGE), 3(UMTS), 4(CDMA), 5(EVDO_0), 6(EVDO_A), 7(1XRTT), 8(HSDPA), 9(HSUPA), 10(HSPA), 11(IDEN), 12(EVDO_B), 13(LTE), 14(EHRPD), 15(HSPAP)
Location area code where the user is located
Identifier of the cell providing service to the terminal
Received signal strength indication (dBm) measured by the terminal (for either WiFi or cellular connections)
Subjective quality (feedback from users)
Video Quality Feedback
Subjective opinion regarding video quality (scale: 1 to 5)
Audio Quality Feedback
Subjective opinion regarding audio quality (scale: 1 to 5)
General feedback from the user (scale: 1 to 5)
The user can add any additional comment
Subjective quality (estimated)
Estimated Video Quality
Estimated video quality from QoE model (scale: 1 to 5)
Our application principally uses standard libraries from the Android SDK. Regarding the collection of network statistics, Android terminals allow retrieving radio-related parameters and performance indicators, like the received signal strength, as well as traffic-related statistics via standard libraries such as android.telephony or android.net. Time tagging (in milliseconds) of performance indicators is performed via the standard time classes, whereas location information is obtained from the Android Location Manager (selecting network providers and GPS, if available).
However, other relevant information is not accessible through the standard Android API, which is more oriented towards applications of general interest. For instance, YouTube application performance metrics (Tinit, frebuf, Trebuf) and some parameters related to the YouTube session (like video duration) cannot be directly obtained from Android API. We have discarded the use of Android packet sniffing capabilities (in the form of .pcap files) as they require the terminal to run with root privileges.
When our QoE tool runs in intrusive mode (i.e., with the player embedded in the application), the measurement of the three application performance metrics is straightforward. For this purpose, we use the YouTube API, which provides the capability not only to embed a YouTube player in an application or to control a standalone YouTube player, but also to obtain certain information about the ongoing YouTube session. Specifically, application performance metrics can be obtained with this API as described next.
The YouTube player has several internal states: unstarted, buffering, playing, paused, and ended. The media player is in the unstarted state when it is first loaded, but as soon as it starts downloading the video file, the player changes its state to buffering. When the buffer has enough data, the playback starts, changing to the playing state. The paused state is entered if the video play-out is paused by the user. The player may go back to the buffering state if the buffer runs out of data during the playback due to congestion problems (a rebuffering event); otherwise, the playback continues until reaching the ended state. Therefore, since the YouTube API provides information about the player state at each moment, it is easy to compute: 1) the initial buffering time (Tinit) as the time elapsed in the buffering state for the first time, 2) the rebuffering frequency (frebuf) from the number of times that the player enters the buffering state during playback, and 3) the mean rebuffering time (Trebuf) from the time elapsed in the buffering state.
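The intrusive-mode computation can be sketched from a timestamped log of player state changes; the event representation below is a hypothetical simplification of the YouTube API state callbacks:

```python
def metrics_from_states(events):
    """events: chronological list of (timestamp_s, state) pairs, with
    states among 'unstarted', 'buffering', 'playing', 'paused', 'ended'.
    Returns (t_init, f_rebuf, t_rebuf): initial buffering time (s),
    rebuffering frequency (events per second of session), and mean
    rebuffering duration (s)."""
    t_init = 0.0
    rebufferings = []
    buffering_since = None
    seen_playing = False
    prev_state = None
    for t, state in events:
        if prev_state == 'buffering':
            duration = t - buffering_since
            if seen_playing:
                rebufferings.append(duration)   # buffering during playback
            else:
                t_init += duration              # first (initial) buffering
        if state == 'buffering':
            buffering_since = t
        elif state == 'playing':
            seen_playing = True
        prev_state = state
    session_time = events[-1][0] - events[0][0]
    f_rebuf = len(rebufferings) / session_time if session_time > 0 else 0.0
    t_rebuf = sum(rebufferings) / len(rebufferings) if rebufferings else 0.0
    return t_init, f_rebuf, t_rebuf

log = [(0, 'buffering'), (2, 'playing'), (10, 'buffering'),
       (13, 'playing'), (60, 'ended')]
m = metrics_from_states(log)
```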
However, when our QoE tool runs in transparent mode, the computation of these metrics is not so easy, as we do not have access to the transitions among states; they have to be estimated from network-level metrics, as detailed in . In particular, the following basic information is required: average TCP throughput, average playing rate, and player buffer size. The model to estimate application QoS metrics from network QoS is valid under certain assumptions: 1) the network bandwidth, round trip time (RTT), and packet loss rate are assumed to be constant during the video download, and 2) the client does not interact with the video during the playback (e.g., pausing or seeking forward/backward). These assumptions may not be very realistic: as throughput and playing rate may vary over time, the player's buffer utilization depends on the instantaneous throughput and play-out rate rather than on their average values. Therefore, this approach is expected to provide slightly optimistic results.
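Under those same assumptions (constant throughput, no user interaction), the application metrics admit a simple closed-form fluid-buffer sketch; the threshold parameters and example values are illustrative assumptions, not the actual player configuration:

```python
def estimate_app_metrics(video_s, coding_rate_bps, throughput_bps,
                         start_threshold_s, rebuf_threshold_s):
    """Fluid-buffer sketch: the buffer fills at r = throughput/coding_rate
    media-seconds per second and, while playing, drains at (1 - r).
    Returns (t_init, n_rebuf, t_rebuf): initial buffering time, number
    of rebuffering events, and duration of each rebuffering."""
    r = throughput_bps / coding_rate_bps
    t_init = start_threshold_s / r
    if r >= 1.0:
        return t_init, 0, 0.0            # buffer never empties
    # a buffered threshold of B media-seconds sustains B / (1 - r)
    # seconds of playback before the buffer empties again
    played = min(start_threshold_s / (1.0 - r), video_s)
    per_rebuf = rebuf_threshold_s / (1.0 - r)
    n_rebuf = 0
    while played < video_s - 1e-9:
        n_rebuf += 1
        played = min(played + per_rebuf, video_s)
    return t_init, n_rebuf, rebuf_threshold_s / r

# e.g., a 10 s clip encoded at 1 Mbps over a 0.5 Mbps connection,
# with 2 s start and rebuffering thresholds
est = estimate_app_metrics(10.0, 1e6, 0.5e6, 2.0, 2.0)
```

Like the model it sketches, this tends to be optimistic, since real throughput fluctuates around its average.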
Examples of causes of low QoE and advices to users (QoE advices module)
High traffic load
IF many applications synchronizing
Temporarily stop data synchronization
ELSE IF many apps running
Offer some apps/services to be switched off
Switch to other technology (WiFi, mobile)
Low network traffic and connected to a cellular network
IF GSM/3G lock on 2G
ELSE IF low RSSI and WiFi available
Switch to a WiFi connection
ELSE IF low RSSI and WiFi switched on and WiFi not available
Switch off WiFi to avoid interference
ELSE IF low RSSI and Bluetooth switched on
Switch off Bluetooth to avoid interference
Low network traffic and connected to WiFi
IF WiFi tethering is activated
Switch off WiFi tethering
ELSE IF Bluetooth switched on
Switch off Bluetooth
Switch to a cellular network connection
Low memory status flag is TRUE
IF many apps/services running
Offer some apps/services to be switched off
ELSE IF ‘hungry’ app detected
Offer to switch off ‘hungry’ app
Check for system updates
High CPU load
CPU load is high during a period
IF many apps/services running
Offer some apps/services to be switched off
Low CPU frequency forced
CPU freq low
IF low battery level
OR high battery temperature
Wait until battery gets in better conditions
ELSE IF aggressive power save profile selected
Select a performance oriented profile
Check for system updates
Video requirements exceed terminal capabilities
YouTube API video source and device HW information
IF device capability < video req.
Try to select less demanding video files, switch off high quality (HQ) option
Low video quality in origin
YouTube API video source information
IF low resolution/coding rate
Select another file of higher quality
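The rule table above can be sketched as an ordered set of condition-advice pairs; the context fields and the RSSI threshold below are illustrative assumptions, not the application's actual data model:

```python
def advise(ctx):
    """Return a list of advice strings for a session context dict with
    keys such as 'traffic_load', 'connection', 'rssi_dbm',
    'wifi_available', 'bluetooth_on', and 'tethering_on'."""
    hints = []
    if ctx.get('traffic_load') == 'high':
        if ctx.get('apps_syncing'):
            hints.append('Temporarily stop data synchronization')
        elif ctx.get('many_apps_running'):
            hints.append('Offer some apps/services to be switched off')
    if ctx.get('traffic_load') == 'low' and ctx.get('connection') == 'cellular':
        if ctx.get('rssi_dbm', 0) < -100 and ctx.get('wifi_available'):
            hints.append('Switch to a WiFi connection')
        elif ctx.get('rssi_dbm', 0) < -100 and ctx.get('bluetooth_on'):
            hints.append('Switch off Bluetooth to avoid interference')
    if ctx.get('traffic_load') == 'low' and ctx.get('connection') == 'wifi':
        if ctx.get('tethering_on'):
            hints.append('Switch off WiFi tethering')
        elif ctx.get('bluetooth_on'):
            hints.append('Switch off Bluetooth')
    return hints

hints = advise({'traffic_load': 'low', 'connection': 'cellular',
                'rssi_dbm': -110, 'wifi_available': True})
```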
The application needs permission from the user to read the phone state, to access location information, to access external storage, and to connect to the Internet. Information is stored on the device until it is reported to the server, using the SQLite3 database included in the operating system.
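Local persistence until reporting can be sketched with a standard SQLite binding (shown here in Python for brevity, although the application itself is an Android app); the table and column names are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(':memory:')   # the application uses an on-device file
conn.execute("""
    CREATE TABLE IF NOT EXISTS sessions (
        id         INTEGER PRIMARY KEY,
        t_init_ms  INTEGER,
        f_rebuf    REAL,
        t_rebuf_ms INTEGER,
        mos        REAL,
        reported   INTEGER DEFAULT 0  -- set to 1 once uploaded to the QoE server
    )""")
conn.execute("INSERT INTO sessions (t_init_ms, f_rebuf, t_rebuf_ms, mos) "
             "VALUES (?, ?, ?, ?)", (1200, 0.0055, 3400, 3.3))
conn.commit()
# sessions still pending upload to the QoE server
pending = conn.execute(
    "SELECT COUNT(*) FROM sessions WHERE reported = 0").fetchone()[0]
```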
The knowledge of the instantaneous YouTube QoE per user may help the operator to perform actions such as: a) modifying the subscriber priority when poor performance is detected in a specific location or for a particular subscriber; b) dynamically setting different bandwidth limits depending on a number of factors like usage patterns, subscriber, location, or time of day; c) sending notifications to subscribers, such as advices to improve their QoE.
Network capacity planning: The proposed QoE solution could help in identifying network bottlenecks and re-dimensioning the network to ensure the targeted QoE.
Traffic forecasting: The proposed QoE solution could feed traffic forecasting processes based on the historical traffic data stored in its database.
Handset and service performance benchmarking: With the growing number of mobile handsets and multimedia content launched onto the market, it is becoming increasingly important for operators to benchmark each individual terminal and measure its performance. This process enables the identification of problematic handsets and the analysis of the cause of the faults. By identifying problematic handsets, operators can quickly make the required adjustments to their network, thus improving the customer experience.
Network monitoring and reporting: The proposed QoE solution uses passive methods to automatically infer the user's perception of the network from passive measurements. The goal is to automatically derive user perception from specific indicators obtained purely from monitoring (eliminating the need for customer surveys), both on the network and terminal sides.
Customer care: The ability to link perceived (subjective) experience with measured (objective) QoE indicators may lead to significant benefits in terms of achieving better insight into customer-perceived quality, in a much wider approach than the current one based on sampling specific customers. QoE monitoring solutions are linked to customer care centers by means of simplified interfaces and overall status indicators for real-time access to customer-specific information, enhancing the response to customer quality issues and thus satisfaction. Customer care teams can rapidly diagnose problems and identify whether the root cause is a badly performing network, mobile terminal, or application.
4. YouTube QoE pilot experiment
Summary for reproduction time (in minutes)
Number of sessions
Next, statistics related to the application performance metrics (namely Tinit, frebuf, and Trebuf, which are required to evaluate the MOS) are analyzed in detail. Then, their effect on the experienced quality is described.
Summary for initial buffering time (Tinit) in seconds

Summary for rebuffering frequency (frebuf)

Summary for number of pauses (Npauses)

Summary for mean rebuffering time (Trebuf) in seconds
Percentage of reported MOS per technology

Average, max, and standard deviation values (in seconds) for Tinit as per technology and MOS

Average, max, and standard deviation values for frebuf as per technology and MOS

Average, max, and standard deviation values (in seconds) for mean rebuffering time as per technology and MOS
Note that this measurement indicates that the reported MOS is about 20% higher than that given by Equation 1. The reason could be that users are more indulgent with wireless connections than with the wired scenarios under which the original model was obtained.
Due to regression properties, the average difference between the MOS obtained by Equation 2 and that reported by users (that is, the residuals) is 0, although there is no symmetry around 0 (see Figure 9, right). Differences between subgroups per technology are not significant (estimated slope of 1.1995 for WiFi connections and 1.2089 for UMTS). We also explored whether a multivariate regression could improve these results. Only linear regression was analyzed, as a quantization of numerical quantities like that proposed in  cannot easily be included in the multivariate procedure. The adjusted R2 including all available parameters is 90.5%, and only slightly lower (90.46%) if the total rebuffering time is removed from the regression. As this value is lower than that obtained with Equation 1, the heuristic measurement quantization proposed in , increased by 20%, seems able to predict the users' expectations well.
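The correction can be sketched as an ordinary least-squares fit of user-reported MOS against the model's MOS; the data points below are hypothetical, chosen only to illustrate a 20% uplift, and are not the pilot's measurements:

```python
def linear_fit(x, y):
    """Closed-form ordinary least squares for y ~ slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

model_mos = [2.0, 2.5, 3.0, 3.5, 4.0]   # hypothetical Equation 1 outputs
user_mos = [2.4, 3.0, 3.6, 4.2, 4.8]    # hypothetical survey scores (20% higher)
slope, intercept = linear_fit(model_mos, user_mos)
```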
This work has presented a QoE evaluation tool for Android terminals which is able to estimate the QoE (in terms of MOS) for the YouTube service based on theoretical models. In particular, this tool makes it possible to map network QoS onto the QoE of YouTube sessions. Additionally, a QoE advices module analyzes possible causes that lead to low QoE and subsequently provides particular advice to the user under certain conditions.
Our application has been tested in a pilot experience over 17 Android terminals for 1 month. According to the statistics, most of the responses from the users' survey match up with the theoretical estimations; however, the QoE model provides slightly more pessimistic results than the opinions of the wireless users, probably because the model was initially generated under wired scenarios. In that sense, we propose a modified utility function obtained by taking a linear regression between the theoretical MOS and the MOS reported by users.
In our opinion, it is critical that application developers provide access to the main key performance indicators (KPIs) associated to their services in order to ease the evaluation and analysis of the QoE.
GG received his B.Sc. and Ph.D. degrees in Telecommunications Engineering from the University of Málaga (Spain) in 1999 and 2009, respectively. From 2000 to 2005, he worked at Nokia Networks and Optimi Corporation (recently acquired by Ericsson), leading the area of QoS for 2G and 3G cellular networks. Since 2005, he has been an associate professor at the University of Málaga. His research interests include the field of mobile communications, especially QoS/QoE evaluation for multimedia services and radio resource management strategies for LTE and LTE-Advanced.
LH is a telecommunications engineer from E.T.S.I.T - UPM, specialized in computer science. He has worked mainly in the defense aerospace industry as a military avionics engineer and in the wireless telecommunication industry as a systems engineer. He also worked for the public administration as a consultant in computer science and as a contract reviewer. Since 1996, he has worked for Telefónica I + D, always involved in mobile network R&D. Recently, he worked in the area of PDI - Enabling Platforms (SLA) as a computer technology specialist. Currently, he is working in the area of PTI - Capacity & Traffic Analysis & Solutions, developing tools based on knowledge extraction, machine learning, and analytic prediction techniques.
QP received his degree in Physics from the Complutense University of Madrid in 1986. In 1987, he started working at Telefónica I + D on aspects related to reliability predictions, failure analysis, and electronic component testing for telecommunication systems. In 2011, he worked on QoE issues for mobile broadband networks, analyzing and developing tools and client applications to control and manage the QoS and QoE of mobile services. Currently, he is working in the GCTO (Global Chief Technology Office) of Telefónica, analyzing and defining the requirements for residential cellular gateways.
JL received his B.Sc. in Telecommunications in 1998 from Universidad Politécnica de Madrid. In 1999, he worked for Teldat on ciphering techniques, and since 2000 he has been working at Telefónica I + D on several areas related to mobile communications and physical layer performance, involving terminal specification and testing, link-level and system-level simulations of 3G/3.5G systems, LTE digital signal processing, and LTE-Advanced. He is currently working on quality of experience in wireless networks.
RG received her B.Sc. degree in Telecommunications Engineering in 1998 from Madrid Polytechnic University. Between 1997 and 1998, she was a visiting student at the Digital Communication Systems Department of the Technical University Hamburg-Harburg (Germany), where she carried out her master's thesis on multipath fading channel modeling. In July 1998, she joined Telefónica I + D, becoming part of the Radio Communication Systems Department. Her career has been oriented towards mobile communications, especially radio planning and optimization and QoE in LTE.
MCAT received her M.S. and Ph.D. degrees in Telecommunications Engineering from the University of Malaga, Spain, in 1994 and 2001, respectively. Currently, she is working at the Department of Communications Engineering of the same university. Her main research interests include adaptive modulation and coding for fading channels, multi-user OFDM, SC-FDMA, cross-layer design, and probabilistic QoS guarantees for wireless communications.
This work has been partially supported by the Spanish Government and FEDER (TEC2010-18451 and TEC2013-44442).
- Sandvine Corporation: Global Internet Phenomena Report, 1H 2012. 2013. Available at: http://www.sandvine.com/news/global_broadband_trends.asp
- Díaz A, Merino P, Rivas FJ: Customer-centric measurements on mobile phones. Proceedings of the 12th IEEE International Symposium on Consumer Electronics, Vilamoura, Portugal, 14–16 April 2008. doi:10.1109/ISCE.2008.4559470
- Ketykó I, De Moor K, De Pessemier T, Verdejo AJ, Vanhecke K, Joseph W, Martens L, De Marez L: QoE measurement of mobile YouTube video streaming. MoViD'10: Proceedings of the 3rd Workshop on Mobile Video Delivery, Firenze, Italy, 25–29 October 2010, pp. 27–32. doi:10.1145/1878022.1878030
- De Moor K, Ketykó I, Joseph W, Deryckere T, De Marez L, Martens L: Proposed framework for evaluating quality of experience in a mobile, testbed-oriented living lab setting. Mobile Netw. Appl. 2010, 15(3):378-391. doi:10.1007/s11036-010-0223-0
- Alvarez A, Cabrero S, Paneda XG, Garcia R, Melendi D, Orea R: A flexible QoE framework for video streaming services. IEEE GLOBECOM Workshops, Houston, Texas, USA, 5–9 December 2011, pp. 1226–1230. doi:10.1109/GLOCOMW.2011.6162377
- Ghareeb M, Viho C: Hybrid QoE assessment is well-suited for multiple description coding video streaming in overlay networks. 2010 Eighth Annual Communication Networks and Services Research Conference (CNSR), Montreal, QC, Canada, 11–14 May 2010, pp. 327–333. doi:10.1109/CNSR.2010.15
- Klaue J, Rathke B, Wolisz A: EvalVid - a framework for video transmission and quality evaluation. Proceedings of the 13th International Conference on Modelling Techniques and Tools for Computer Performance Evaluation, Urbana, IL, USA, 2–5 September 2003, pp. 255–272. doi:10.1007/978-3-540-45232-4_16
- Aguiar E, Riker A, Cerqueira E, Abelém A, Mu M, Braun T, Curado M, Zeadally S: A real-time video quality estimator for emerging wireless multimedia systems. Wireless Networks 2014, 20(8):1-18. doi:10.1007/s11276-014-0709-y
- Laghari K, Pham TT, Nguyen H, Crespi N: QoM: a new quality of experience framework for multimedia services. Proc. IEEE Symposium on Computers and Communications (ISCC), Cappadocia, Turkey, 1–4 July 2012, pp. 851–856. doi:10.1109/ISCC.2012.6249408
- Schatz R, Hoßfeld T, Casas P: Passive YouTube QoE monitoring for ISPs. Proc. of the Sixth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS), Palermo, Italy, 4–6 July 2012, pp. 358–364. doi:10.1109/IMIS.2012.12
- Staehle B, Hirth M, Pries R, Wamser F, Staehle D: YoMo: a YouTube application comfort monitoring tool. Proc. of the EuroITV Workshop on QoE for Multimedia Content Sharing (QoEMCS), Tampere, Finland, 9 June 2010
- Hossfeld T, Seufert M, Hirth M, Zinner T, Tran-Gia P, Schatz R: Quantification of YouTube QoE via crowdsourcing. IEEE International Symposium on Multimedia (ISM), Dana Point, California, USA, 5–7 December 2011, pp. 494–499. doi:10.1109/ISM.2011.87
- Hossfeld T, Schatz R, Egger S, Fiedler M, Masuch K, Lorentzen C: Initial delay vs. interruptions: between the devil and the deep blue sea. Proc. of the 4th International Workshop on Quality of Multimedia Experience (QoMEX), Yarra Valley, Victoria, Australia, 5–7 July 2012, pp. 1–6. doi:10.1109/QoMEX.2012.6263849
- Porter T, Peng X: An objective approach to measuring video playback quality in lossy networks using TCP. IEEE Commun. Lett. 2011, 15(1):76. doi:10.1109/LCOMM.2010.110310.101642
- Mok RKP, Chan EWW, Chang RKC: Measuring the quality of experience of HTTP video streaming. Proceedings of the 12th IFIP/IEEE International Symposium on Integrated Network Management (IFIP/IEEE IM), Dublin, Ireland, 23–27 May 2011, pp. 485–492. doi:10.1109/INM.2011.5990550
- Gill P, Arlitt M, Li Z, Mahanti A: YouTube traffic characterization: a view from the edge. Proc. of the 7th ACM SIGCOMM Conference on Internet Measurement, San Diego, California, USA, 24–26 October 2007, pp. 15–28. doi:10.1145/1298306.1298310
- Ameigeiras P, Ramos-Munoz JJ, Navarro-Ortiz J, Lopez-Soler JM: Analysis and modeling of YouTube traffic. Trans. Emerg. Telecommunications Technol. 2012, 23(4):360-377. doi:10.1002/ett.2546
- Sandvine Incorporated ULC: White paper: "Measuring Over-the-Top Video Quality". 2013. Available at: https://www.sandvine.com/downloads/general/whitepapers/measuring-over-the-top-video-quality.pdf
- Gonzalez RC, Wintz P: Digital Image Processing. 2nd edition. Addison-Wesley, Essex, England; 1987.
- Wang Z, Lu L, Bovik AC: Video quality assessment based on structural distortion measurement. J. Signal Process: Image Commun. 2004, 19(2):121-132. doi:10.1016/S0923-5965(03)00076-6
- ITU-T: Recommendation J.247. Objective perceptual multimedia video quality measurement in the presence of a full reference. 2008. Available at: http://www.itu.int/rec/T-REC-J.247-200808-I/en
- ITU-T: Recommendation J.144. Objective perceptual video quality measurement techniques for digital cable television in the presence of a full reference. 2004. Available at: http://www.itu.int/rec/T-REC-J.144-200403-I/en
- Khan A, Sun L, Ifeachor E: Content-based video quality prediction for MPEG4 video streaming over wireless networks. J. Multimedia 2009, 4(4):228-239. doi:10.4304/jmm.4.4.228-239
- Bhamidipati VD, Kilari S: Effect of Delay/Delay Variation on QoE in Video Streaming. M.Sc. thesis, School of Computing, Blekinge Institute of Technology, Sweden; 2010.
- Janowski L, Romaniak P: QoE as a function of frame rate and resolution changes. Proc. of the 3rd International Workshop on Future Multimedia Networking (FMN), Krakow, Poland, 17–18 June 2010, pp. 34–45. doi:10.1007/978-3-642-13789-1_4
- Padhye J, Firoiu V, Towsley D, Kurose J: Modeling TCP Reno performance: a simple model and its empirical validation. IEEE/ACM Trans. Networking 2000, 8(2):133-145. doi:10.1109/90.842137
- Zheng D, Lazarou GY, Hu R: A stochastic model for short-lived TCP flows. Proceedings of the IEEE International Conference on Communications, Anchorage, Alaska, USA, 11–15 May 2003, pp. 76–81. doi:10.1109/ICC.2003.1204146
- Fiedler M, Hossfeld T, Tran-Gia P: A generic quantitative relationship between quality of experience and quality of service. IEEE Netw. 2010, 24(2):36-41. doi:10.1109/MNET.2010.5430142
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.