
Assessing the Energy Consumption of Smartphone Applications


by

Mustafa M. Abousaleh

B.A.Sc., University of British Columbia, 2011

A Dissertation Submitted in Partial Fulfillment of the Requirements for the Degree of

MASTER OF APPLIED SCIENCE

in the Department of Electrical and Computer Engineering

© Mustafa M. Abousaleh, 2013
University of Victoria

All rights reserved. This dissertation may not be reproduced in whole or in part, by photocopying or other means, without the permission of the author.


Supervisory Committee

Dr. Thomas E. Darcie, Co-Supervisor

(Department of Electrical and Computer Engineering)

Dr. Stephen W. Neville, Co-Supervisor

(Department of Electrical and Computer Engineering)

ABSTRACT

Mobile devices are increasingly becoming essential in people's lives. Advances in technology and mobility are allowing users to rely on mobile devices for communication, entertainment, financial planning, fitness tracking, and more. As a result, mobile applications are also becoming important contributors to user utility. However, battery capacity is the limiting factor impacting the quality of the user experience. Hence, it is imperative to understand how much energy impact mobile apps have on the system relative to other device activities. This thesis presents a systematic study of the energy impact of mobile app features. Time-series electrical current measurements are collected from four different modern smartphones. Statistical analysis methodologies are used to calculate the energy impact of each app feature by identifying and extracting mobile app-feature events from the overall current signal. In addition, the app overhead energy costs are computed. Total energy consumption equations for each component are developed, and an overall total energy consumption equation is presented. Minutes Lost (ML) of normal phone operation due to the energy consumption of the mobile app functionality is computed for cases where the mobile app is simulated to run on the various devices for 30 minutes. The Tutela Technologies Inc. mobile app, NAT, is used for this study. NAT has two main features: Quality of Service (QoS) and Throughput (TP). The impact of the QoS feature is indistinguishable, i.e. ML is zero, relative to other phone activities. The ML with only the TP feature enabled is on average 2.1 minutes. Enabling the GPS increases the ML on average to 11.5 minutes. Displaying the app GUI in addition to running the app features and enabling the GPS results in an average ML of 12.4 minutes. Amongst the various mobile app features and components studied, the GPS consumes the highest amount of energy. It is estimated that the GPS increases the ML by about 448%.


Contents

Supervisory Committee
Abstract
Table of Contents
List of Tables
List of Figures
Acknowledgements
Dedication

1 Introduction
1.1 Network Assessment Application
1.2 Problem Scope
1.3 Problem Statement and Thesis Contributions
1.4 Research Partners
1.5 Thesis Outline

2 Related Work
2.1 Mobile Apps Energy Bugs and Energy Hogs
2.2 Energy Profiling Tools
2.2.1 Trepn Profiler
2.2.2 PowerTutor
2.2.3 AppScope
2.2.4 Monsoon Power Monitor
2.3 Energy Profiling Methodologies
2.3.2 Utilization-Based and Time-Split Accounting Policies
2.3.3 System-Call Based
2.4 Asynchronous Power Behaviour
2.5 Battery Profiling
2.6 Summary

3 Experimental Design
3.1 Application Under Test
3.1.1 Application Features
3.1.2 Database Server
3.1.3 Application Modes
3.2 Phone Testing Environment
3.2.1 Wavefront Laboratory Environment
3.2.2 Phone Configuration
3.3 Experimental Tests
3.3.1 Device Baseline Tests
3.3.2 NAT Functional Tests
3.3.3 NAT Baseline Tests
3.4 Mobile Device Selection
3.5 Summary

4 Statistical Testing Methodology
4.1 QoS and Throughput Impacts
4.1.1 Timestamps from Database Server
4.1.2 Throughput Test Signature
4.1.3 QoS Test Signature
4.2 Startup Transient
4.3 Signal Denoising
4.4 Matched Filters
4.4.1 Throughput Template
4.4.2 QoS Template
4.5 Extracting TP Signal Instances
4.5.1 Water Filling Approach
4.5.2 Moving Average Filter Smoothing
4.6 Signal-of-Interest Energy Calculations
4.6.1 Normalization of Energy Consumption Calculations
4.6.2 Energy Consumption Rate Comparison
4.6.3 Refining Ranges
4.6.4 TP Event Durations
4.6.5 Energy Consumption of NAT app Throughput Feature
4.7 Overhead Energy Calculations
4.7.1 NAT App Overhead Energy
4.7.2 GPS Overhead Energy
4.7.3 GUI Interface Overhead Energy
4.8 Total Energy Consumption & Minutes Lost
4.9 Summary

5 Results & Analysis
5.1 QoS Feature
5.1.1 "QoS" NAT Functional Configuration
5.1.2 "QoS+GPS" NAT Functional Configuration
5.1.3 "QoS+Throughput" NAT Functional Configuration
5.1.4 "QoS+Throughput+GPS" NAT Functional Configuration
5.2 TP Feature
5.2.1 "TP" Functional Configuration
5.2.2 "TP+GPS" Functional Configuration
5.2.3 "QoS+TP" Functional Configuration
5.2.4 "QoS+TP+GPS" Functional Configuration
5.3 Overhead Energy
5.3.1 NAT App Overhead
5.3.2 GPS Overhead
5.3.3 GUI Overhead
5.4 Total Energy Consumption & Minutes Lost
5.5 Summary

6 Conclusions & Future Work
6.1 Conclusions
6.2 Future Work
6.2.2 Improved Statistical Approach
6.2.3 Underlying Distributions

List of Tables

Table 3.1 Device Baseline Tests and number of runs per test type.
Table 3.2 NAT Baseline Tests and number of runs per test type.
Table 3.3 NAT Baseline Tests and number of runs per test type.
Table 5.1 Energy consumption rates pertaining to the NAT app TP functional test for different mobile devices. All measurement units are in mA·s per second of operation.
Table 5.2 TP events' most probable durations under the NAT app TP functional configuration for the different mobile platforms. All units are in seconds.
Table 5.3 Total energy consumption equation constant CT under the NAT app TP functional configuration for the different mobile devices. The dimension of CT is mA·s.
Table 5.4 Energy consumption rates pertaining to the NAT app TP+GPS functional test for different mobile devices. All measurement units are in mA·s per second of operation.
Table 5.5 TP events' most probable durations under the NAT app TP+GPS functional configuration for the different mobile platforms. All units are in seconds.
Table 5.6 Total energy consumption equation constant CT under the NAT app TP+GPS functional configuration for the different mobile devices. The dimension of CT is mA·s.
Table 5.7 Energy consumption rates pertaining to the NAT app QoS+TP functional test for different mobile devices. All measurement units are in mA·s per second of operation.
Table 5.8 TP events' most probable durations under the NAT app QoS+TP functional configuration for the different mobile platforms. All units are in seconds.
Table 5.9 Total energy consumption equation constant CT under the NAT app QoS+TP functional configuration for the different mobile devices. The dimension of CT is mA·s.
Table 5.10 Energy consumption rates pertaining to the NAT app QoS+TP+GPS functional test for different mobile devices. All measurement units are in mA·s per second of operation.
Table 5.11 TP events' most probable durations under the NAT app QoS+TP+GPS functional configuration for the different mobile platforms. All units are in seconds.
Table 5.12 Total energy consumption equation constant CT under the NAT app QoS+TP+GPS functional configuration for the different mobile devices. The dimension of CT is mA·s.
Table 5.13 Energy consumption rates of baseline and "NAT Options OFF" tests, with the increase in consumption rate due to the NAT app overhead, across four different mobile devices. All units are in mA·s.
Table 5.14 Energy consumption rates of baseline, "NAT Options OFF" and "GPS Option ON" tests, with the increase in consumption rate due to the GPS overhead, across four different mobile devices. All units are in mA·s.
Table 5.15 Energy consumption rates of baseline and NAT GUI tests, with the increase in consumption rate due to the NAT app GUI overhead, across four different mobile devices. All units are in mA·s.
Table 5.16 Total energy consumption equation constant CT for all NAT app functional configurations for the different mobile devices, in addition to the computed average of the CT values. The dimension of CT is mA·s.
Table 5.17 The values of Doo, DGPS, DGUI and Phone Baseline energy consumption rates for all of the mobile phones. All measurement units are energy rates of mA·s per second of operation.

List of Figures

Figure 2.1 Example of power consumption graphs provided by PowerTutor [1].
Figure 2.2 AppScopeViewer with collected sample power consumption data [2].
Figure 2.3 PowerTool software graphical tool.
Figure 2.4 Monsoon Power Monitor external hardware power supply and power monitor [3].
Figure 2.5 Energy consumption rate probability distributions of a community with and without the subject app installed [4].
Figure 2.6 Energy consumption rate probability distributions of an app and of an instance of the app collected from a community of users [4].
Figure 3.1 Tutela NAT in Foreground Application Mode.
Figure 3.2 Relative number of mobile devices running a specific Android OS [5].
Figure 4.1 Nexus Galaxy NAT Throughput 10-minute test - Run 1.
Figure 4.2 Impact of the Throughput test.
Figure 4.3 Three NAT Throughput events.
Figure 4.4 Nexus Galaxy QoS 10-minute test - Run 1.
Figure 4.5 Two QoS event signals highlighted from Nexus Galaxy QoS Test Run 1.
Figure 4.6 Plots of three NAT QoS events.
Figure 4.7 Three NAT QoS events smoothed out with moving average filter.
Figure 4.8 Startup transient mixed with first NAT TP test event.
Figure 4.9 Nexus Galaxy Throughput test Run-1 with and without moving average filter.
Figure 4.10 Output of matched filter using denoised Nexus Galaxy Throughput test Run-1 with TP template signal.
Figure 4.11 Zero-Mean QoS template used for matched filter operation.
Figure 4.12 Nexus Galaxy QoS test Run-1 highlighting regions where QoS events occur and ones where QoS events do not occur.
Figure 4.13 Distributions of normalized energy consumption rates for 4 different Galaxy Nexus QoS test runs (Run 1, Run 2, Run 3, Run 4).
Figure 4.14 Distributions of normalized energy consumption rates for the Galaxy Nexus QoS tests.
Figure 4.15 Nexus Galaxy average energy consumption rates CDFs for portions containing QoS events and portions not containing QoS events.
Figure 4.16 Threshold level determined using water filling algorithm on Nexus Galaxy Throughput Run-1 test.
Figure 4.17 Local maximums for output of matched filter from the Nexus Galaxy Throughput Run-1 test after applying a moving average filter.
Figure 4.18 Nexus Galaxy Throughput Run-1 test with detected TP events highlighted in red.
Figure 4.19 Nexus Galaxy Throughput Run-1 test with highlighted ranges to be used for the energy consumption analysis.
Figure 4.20 Normalized energy distributions for Nexus Throughput test runs.
Figure 4.21 Galaxy Nexus TP test Run 1 denoised, with data points falling between the current values of interest highlighted in red.
Figure 4.22 Nexus Galaxy Throughput Run 1 test denoised with moving average, with values falling in the current range extracted from energy consumption rates distributions highlighted in red.
Figure 4.23 TP events durations distribution for the Nexus Galaxy NAT app Throughput Run 1 test.
Figure 4.24 Nexus Galaxy NAT app Throughput configuration - TP events time durations distribution.
Figure 4.25 Energy consumption rates distributions for Galaxy Nexus
Figure 5.1 Samsung Galaxy Nexus average energy consumption rates empirical CDFs - NAT QoS configuration.
Figure 5.2 HTC 3D Evo average energy consumption rates empirical CDFs - NAT QoS configuration.
Figure 5.3 Motorola XT615 average energy consumption rates empirical CDFs - NAT QoS configuration.
Figure 5.4 Samsung Galaxy Note II average energy consumption rates empirical CDFs - NAT QoS configuration.
Figure 5.5 Samsung Galaxy Nexus average energy consumption rates empirical CDFs - NAT QoS+GPS configuration.
Figure 5.6 HTC 3D Evo average energy consumption rates empirical CDFs - NAT QoS+GPS configuration.
Figure 5.7 Motorola XT615 average energy consumption rates empirical CDFs - NAT QoS+GPS configuration.
Figure 5.8 Samsung Galaxy Note II average energy consumption rates empirical CDFs - NAT QoS+GPS configuration.
Figure 5.9 Samsung Galaxy Nexus average energy consumption rates empirical CDFs - NAT QoS+TP configuration.
Figure 5.10 HTC 3D Evo average energy consumption rates empirical CDFs - NAT QoS+TP configuration.
Figure 5.11 Motorola XT615 average energy consumption rates empirical CDFs - NAT QoS+TP configuration.
Figure 5.12 Samsung Galaxy Note II average energy consumption rates empirical CDFs - NAT QoS+TP configuration.
Figure 5.13 Sample Samsung Galaxy Nexus QoS+TP test run time-series current recordings signal with QoS and non-QoS regions highlighted.
Figure 5.14 Samsung Galaxy Nexus average energy consumption rates empirical CDFs - NAT QoS+TP+GPS configuration.
Figure 5.15 HTC 3D Evo average energy consumption rates empirical CDFs - NAT QoS+TP+GPS configuration.
Figure 5.16 Motorola XT615 average energy consumption rates empirical CDFs - NAT QoS+TP+GPS configuration.
Figure 5.17 Samsung Galaxy Note II average energy consumption rates empirical CDFs - NAT QoS+TP+GPS configuration.
Figure 5.18 Samsung Galaxy Nexus energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT App TP configuration.
Figure 5.19 HTC 3D Evo energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT App TP configuration.
Figure 5.20 Motorola XT615 energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT App TP configuration.
Figure 5.21 Samsung Galaxy Note II energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT App TP configuration.
Figure 5.22 Samsung Galaxy Nexus TP events duration distributions from matched filter procedure and refined approach for the NAT app TP configuration.
Figure 5.23 HTC 3D Evo TP events duration distributions from matched filter procedure and refined approach for the NAT app TP configuration.
Figure 5.24 Motorola XT615 TP events duration distributions from matched filter procedure and refined approach for the NAT app TP configuration.
Figure 5.25 Samsung Galaxy Note II TP events duration distributions from matched filter procedure and refined approach for the NAT app TP configuration.
Figure 5.26 Samsung Galaxy Nexus energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app TP+GPS configuration.
Figure 5.27 HTC 3D Evo energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app TP+GPS configuration.
Figure 5.28 Motorola XT615 energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app TP+GPS configuration.
Figure 5.29 Samsung Galaxy Note II energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app TP+GPS configuration.
Figure 5.30 Samsung Galaxy Nexus TP events duration distributions from matched filter procedure and refined approach for the NAT app TP+GPS configuration.
Figure 5.31 HTC 3D Evo TP events duration distributions from matched filter procedure and refined approach for the NAT app TP+GPS configuration.
Figure 5.32 Motorola XT615 TP events duration distributions from matched filter procedure and refined approach for the NAT app TP+GPS configuration.
Figure 5.33 Samsung Galaxy Nexus energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP configuration.
Figure 5.34 HTC 3D Evo energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP configuration.
Figure 5.35 Motorola XT615 energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP configuration.
Figure 5.36 Samsung Galaxy Note II energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP configuration.
Figure 5.37 Samsung Galaxy Nexus TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP configuration.
Figure 5.38 HTC 3D Evo TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP configuration.
Figure 5.39 Motorola XT615 TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP configuration.
Figure 5.40 Samsung Galaxy Note II TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP configuration.
Figure 5.41 Samsung Galaxy Nexus energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP+GPS configuration.
Figure 5.42 HTC 3D Evo energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP+GPS configuration.
Figure 5.43 Motorola XT615 energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP+GPS configuration.
Figure 5.44 Samsung Galaxy Note II energy consumption rate distributions for regions before TP, TP, and after TP events for the NAT app QoS+TP+GPS configuration.
Figure 5.45 Samsung Galaxy Nexus TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP+GPS configuration.
Figure 5.46 HTC 3D Evo TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP+GPS configuration.
Figure 5.47 Motorola XT615 TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP+GPS configuration.
Figure 5.48 Samsung Galaxy Note II TP events duration distributions from matched filter procedure and refined approach for the NAT app QoS+TP+GPS configuration.
Figure 5.49 Samsung Galaxy Nexus energy consumption rates distributions for phone baseline and "NAT Options OFF" tests.
Figure 5.50 HTC 3D Evo energy consumption rates distributions for phone baseline and "NAT Options OFF" tests.
Figure 5.51 Motorola XT615 energy consumption rates distributions for phone baseline and "NAT Options OFF" tests.
Figure 5.52 Samsung Galaxy Note II energy consumption rates distributions for phone baseline and "NAT Options OFF" tests.
Figure 5.53 Samsung Galaxy Nexus energy consumption rates distributions for phone baseline, "NAT Options OFF" and "GPS Option ON" tests.
Figure 5.54 HTC 3D Evo energy consumption rates distributions for phone baseline, "NAT Options OFF" and "GPS Option ON" tests.
Figure 5.55 Motorola XT615 energy consumption rates distributions for phone baseline, "NAT Options OFF" and "GPS Option ON" tests.
Figure 5.56 Samsung Galaxy Note II energy consumption rates distributions for phone baseline, "NAT Options OFF" and "GPS Option ON" tests.
Figure 5.57 Samsung Galaxy Nexus energy consumption rates distributions for phone baseline and NAT GUI tests.
Figure 5.58 HTC 3D Evo energy consumption rates distributions for phone baseline and NAT GUI tests.
Figure 5.59 Motorola XT615 energy consumption rates distributions for phone baseline and NAT GUI tests.
Figure 5.60 Samsung Galaxy Note II energy consumption rates distributions for phone baseline and NAT GUI tests.
Figure 5.61 Estimated minutes of normal phone operations lost due to energy consumption of running the Tutela NAT app for 30 minutes with the TP feature enabled but GPS disabled and the NAT app in a Background Service mode, where the GUI interface is not visible, across different smartphone devices with varying TP period.
Figure 5.62 Estimated minutes of normal phone operations lost due to energy consumption of running the Tutela NAT app for 30 minutes with the TP feature and GPS enabled and the NAT app in a Background Service mode, where the GUI interface is not visible, across different smartphone devices with varying TP period.
Figure 5.63 Estimated minutes of normal phone operations lost due to energy consumption of running the Tutela NAT app for 30 minutes with the TP feature and GPS enabled and the NAT app in a Foreground Application mode, where the GUI interface is visible, across different smartphone devices with varying TP period.


ACKNOWLEDGEMENTS

I would like to thank Tutela Technologies Inc. and its team for their time and for providing their platform to perform this study. I would also like to thank Wavefront Inc. for the collaboration that helped complete this study. In addition, I thank Dr. Ted Darcie and Dr. Stephen Neville for their supervision of the work and for their time and effort. Finally, I thank MITACS-Accelerate for the financial grants that made this work possible.


DEDICATION

To the Syrian revolution and the martyrs of the Middle East uprisings. In the past two years, I learned a lot not only about technical and business issues, but also about global politics, non-profit relief efforts, and non-violent resistance. I felt the sense of connectedness to humanity and was honoured to contribute to it. If you do not know yourself, you will not be able to know your fellow human.

Chapter 1

Introduction

Mobile devices are increasingly becoming essential elements in people's lives. The mobility factor, combined with advanced technology and simplicity of use, is changing the way people interact with their mobile devices. This behaviour change includes how end-users make purchases [6] [7], what device type they prefer to use [8] [9], and how they interact with mobile applications [10] [11] [12]. The smartphone market is also witnessing not only an explosive increase in the number of shipped devices, but also fierce competition between the large players. Players in the smartphone market include device manufacturers, e.g. Apple and Samsung, but also operating-system makers, e.g. Microsoft and Google. The competition is also moving from the mobile device itself to the features associated with it. Features such as smartphone applications and app stores create a strong appeal for acquiring newly released devices.

The intimate experience that mobile users have with their mobile devices is enhanced by mobile applications. It is estimated that 56 billion smartphone apps and 14 billion tablet apps will be downloaded this year [13]. Also, it is estimated that about 80% of the minutes spent using mobile devices are spent on mobile applications [10]. Mobile applications cover a wide range of functionality and purposes for the user, including entertainment, communication, productivity, and fitness. However, mobile devices are starting to include tools and applications that perform tasks which do not provide direct utility to the end-user. User experience and quality-of-service tracking functionalities are of interest to the cellular wireless providers.

Cellular wireless providers increasingly need to provide seamless mobile services. Weak 3G wireless signals, dropped calls, and lost text messages negatively affect the reputation of the provider. As a result, the mobile user experience is a central issue of concern for cellular carriers. Quality-of-service improvements impose high costs on the provider. Installing new towers, upgrading equipment, and maintaining existing infrastructure all carry demanding expenses. However, even if the wireless service provider performs all of the aforementioned tasks, the quality of service detected by the wireless infrastructure equipment might differ from what the user is actually experiencing at their end. The strong need for user-experience analytics, and the high cost of maintaining a high quality of service, have opened new opportunities and markets to solve this complex problem.

Mobile applications, embedded tools, and SDKs are developed to help mobile manufacturers, wireless service providers, and software developers understand the user experience. Data is collected and sent over the network, cellular or wireless, to a remote server for analysis. Minimal processing and packaging of the data is performed on the end node, i.e. the mobile device, to mitigate the impact on the system. The advancement of hardware technology, such as CPU and memory, for mobile phones makes such data collection less disruptive to the user experience. Mobile CPUs have high processing power, and memory chips can hold large amounts of data. However, the limiting factor for mobile devices is the battery capacity. Optimizing the charge life of mobile batteries continues to be a fundamental challenge affecting the user experience. Since mobile apps account for most of the time users spend on their devices, understanding and accounting for the energy impact of mobile apps is important.

This thesis looks at one of the companies that provides a solution for wireless service providers to understand the quality of service. The work studies the energy impact of the mobile app component on the mobile device. This chapter covers background on the company providing the solution, Tutela Technologies Inc., and the overall functionality of the mobile app. It also introduces the potential impact this solution poses on the end user. The research scope and the partners collaborating on this research are indicated. Finally, the problem statement and contributions, and the outline of the dissertation, are presented to conclude this chapter.

1.1 Network Assessment Application

Tutela Technologies Inc. provides an infrastructure-less solution that allows wireless service providers to understand the quality of service from the end-user perspective. The solution, the Network Assessment Tool (NAT), comes in the form of a native mobile application directly installed on the mobile device. It also comes in the form of a development SDK that allows developers to embed its features and functionalities in their mobile applications. To make sense of the information collected from the user base, Tutela provides a web-based Network Analysis Dashboard (NAD) that can be accessed by the cellular carriers. The dashboard provides multiple functionalities including customer care, network assessment, and network-planning capabilities.

The client-side component of the Tutela solution, i.e. the NAT app, performs periodic tests to collect the data required to quantify the user quality of experience. This data is minimally processed and packaged before being transferred over the wireless network to a remote server. The NAT app provides multiple tests that can be run under various configurations. This thesis presents a study of the energy consumption of the NAT app operations under each configuration. This study helps in understanding the impact of each possible configuration on the overall system battery, and hence on the user experience. The scope of this research focuses specifically on the NAT mobile app for Android phones. Focusing on the app, as opposed to the SDK, allows us not only to study the impact of the app's features, but also to understand the overhead of the app GUI. The focus on the Android market is due to the dominance of Android OS devices in the smartphone market [14][15].

1.2 Problem Scope

Batteries are a major limiting factor for mobile devices and an important contributor to the overall user experience. Mobile users spend most of their mobile media time interacting with mobile apps. Mobile apps can have multiple distinct features or operations that vary in their energy impact on the overall system. However, the mobile app is closed-source software that can only be treated as a black box. As a result, the app code cannot be probed for granular analysis. Statistical methodologies, however, can be used to extract the energy impact of the app features. This work aims to provide experimental and statistical analysis methodologies to address this challenge. To demonstrate the methodology, a user-experience analytics mobile app, i.e. the Tutela NAT app, is used. The NAT app provides distinct features that can be enabled independently of each other.
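As a concrete illustration of this black-box approach, the sketch below applies a matched filter to a synthetic current trace to locate feature events and integrate their energy. Everything here is an illustrative assumption (the sampling rate, baseline current, rectangular event shape, and half-maximum threshold); it is not the thesis's actual data, template, or detection parameters, which are developed in Chapter 4.

```python
import numpy as np

# Hedged sketch: detect app-feature events in a (synthetic) current trace
# with a matched filter, then integrate the above-baseline current over
# each detected event to estimate its energy in mA*s.
rng = np.random.default_rng(0)
fs = 10.0                          # assumed sampling rate, samples/s
dt = 1.0 / fs
t = np.arange(0.0, 600.0, dt)      # a 10-minute trace
baseline = 120.0                   # assumed idle current, mA
current = baseline + rng.normal(0.0, 5.0, t.size)

event_len_s = 15.0                 # assumed event duration, s
for start in (60.0, 240.0, 420.0): # three injected "TP-like" events
    current[(t >= start) & (t < start + event_len_s)] += 300.0

# Matched filter: remove the baseline level, then correlate with a
# rectangular template of the expected event duration.
detrended = current - np.median(current)
template = np.ones(int(event_len_s * fs))
score = np.correlate(detrended, template, mode="same")

# Threshold at half the peak response and split contiguous runs of
# above-threshold samples into distinct events.
above = np.flatnonzero(score > 0.5 * score.max())
events = np.split(above, np.flatnonzero(np.diff(above) > 1) + 1)

# Per-event energy: integrate the above-baseline current over a
# template-length window centred on each matched-filter peak.
half = len(template) // 2
energies = []
for ev in events:
    centre = ev[score[ev].argmax()]
    window = slice(max(centre - half, 0), centre + half)
    energies.append(float(detrended[window].sum() * dt))

print(len(events), [round(e) for e in energies])
```

On a real trace the synthetic signal would be replaced by the measured current, and the template would be estimated from labelled event signatures rather than assumed rectangular.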


1.3 Problem Statement and Thesis Contributions

The goal of this thesis is to answer the following question: what is the energy impact of the Tutela NAT app features on the mobile device, and can this impact be modelled using mathematical equations applicable to all users of the NAT app?

The contributions of this thesis can be summarized as follows:

• Develop an experimental design and statistical analysis methodology for collecting mobile device energy-related measurements and analysing the data for energy profiling.

• Determine the overall energy impact of the experience app features on the mobile device across four different mobile platforms, and model the consumption impact through mathematical equations pertaining to each configuration.
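To give the second contribution a concrete shape before the full development in Chapters 4 and 5, the following is a hedged sketch of the general form such equations could take, using the per-configuration constant CT and the per-second consumption rates (baseline, app overhead Doo, GPS overhead DGPS, GUI overhead DGUI) that appear in the tables of Chapter 5. The symbols and form are illustrative assumptions, not the thesis's exact model:

```latex
% Illustrative form only; the exact equations are developed in Ch. 4-5.
% Total energy (mA \cdot s) consumed over a run of t seconds:
E_{\mathrm{total}}(t) \approx C_T
  + \left(D_{\mathrm{base}} + D_{oo} + D_{GPS} + D_{GUI}\right)\, t

% Minutes Lost: the app's extra energy re-expressed as minutes of
% normal (baseline-rate) phone operation:
\mathrm{ML}(t) \approx
  \frac{E_{\mathrm{total}}(t) - D_{\mathrm{base}}\, t}{60\, D_{\mathrm{base}}}
```

Under a form like this, a larger overhead rate (e.g. enabling the GPS) raises ML linearly with run duration, which is consistent with the abstract's observation that the GPS dominates the measured Minutes Lost.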

1.4 Research Partners

The Tutela NAT app is to be deployed to millions of mobile phones. Therefore, it is important, from a business and marketing perspective, to have a well-established partner involved in the impact-analysis process. The experiment to collect energy consumption data is designed in collaboration with Wavefront Inc. Wavefront provides controlled experimental laboratories and vast experience in the field of mobile phone testing. They also provide the smartphones that the app is tested on. The details of the collaboration are explained in a later section. Also, working very closely with Tutela helps in determining the best energy consumption measurement metrics. These metrics are used by Tutela to design a solution that the client is satisfied with.

1.5 Thesis Outline

This chapter provided an introduction to the work and the scope of the problem to be solved. The relationships with the involved entities and the collaboration rationale were discussed. Finally, the problem statement and the thesis contributions were provided. The remainder of this work is organized as follows:

Chapter 2 discusses existing work in the field of mobile energy and power profiling, including power measurement tools and overarching methodologies.


Chapter 3 presents the experimental design and the assumptions made to collect the required data about mobile device energy consumption and the impact of the mobile experience app.

Chapter 4 illustrates the statistical testing methodologies applied to the collected data to determine the energy impact of the experience app.

Chapter 5 presents and analyses the energy impact results from the different platforms and testing configurations.

Chapter 6 summarizes the contributions of this work, outlines potential future work, and provides a conclusion to this thesis.


Chapter 2

Related Work

This chapter presents some of the work done in the area of power and energy measurement and profiling for mobile devices and smartphones. The research and development space is large, with many differing approaches and terminologies. This large body of work is a direct result of the complexity of determining energy drainage on mobile devices, especially with quickly changing and advancing hardware and software technologies. Energy measurement and profiling generally face two opposing challenges: the first is the accuracy and granularity of the profiling, and the second is the ease of implementation and use.

Obtaining accurate measurements with very low errors, although dependent on the application and environment, requires involved processes and algorithms. Modern smartphone OSes have complicated built-in power management policies [16]. These policies can make it difficult for researchers to accurately account for power consumption because they interfere with the power state of the device components. For example, if the profiling method does not provide an external power supply that bypasses the phone battery, then at low battery levels the OS power management policies aggressively shut down some system components to increase the lifetime of the battery. This process can skew benchmarks of the expected energy consumption under normal conditions. Methods involving bypassing the Lithium-ion battery are neither portable nor scalable. To allow average mobile users and mobile app developers to profile mobile energy, power management system developers create easy-to-use tools and mobile applications to perform these tasks.

For the purposes of this research, the focus is narrowed down to the work and tools that involve Android mobile devices and smartphones. This chapter covers the concept of energy bugs and energy hogs in relation to mobile apps. It also explores some of the state-of-the-art systems used for mobile energy profiling and presents relevant concepts and methodologies used for energy profiling. In addition, this chapter discusses the issue of asynchronous power behavior and how it complicates the effort of attributing energy consumed to an entity, hardware or software. Finally, this chapter briefly explains the issue arising when profiling energy while powering the mobile phone using a battery.

2.1 Mobile Apps Energy Bugs and Energy Hogs

Modern smartphones are becoming increasingly more powerful, with more memory, faster CPUs and stronger GPUs, but they are still limited by the capacity of the battery. The mobile applications designed and implemented for gaming, finance, nutrition tracking, international communication, advertising, music, TV, etc. have placed the smartphone at the center of the end-user's life. The functionality of mobile apps and the services they provide increasingly require more resources from the mobile phone, including higher CPU utilization, more memory usage and faster wireless communication radios. Mobile users spend the majority of their time using mobile apps. This means that the utility derived from mobile apps is also heavily affected by the battery capacity. In essence, it is important to understand the energy consumption of modern smartphone applications.

A new line of research has emerged looking into energy drainage specifically for smartphones and mobile devices. The term Energy Bugs, or ebugs, has been introduced to label abnormal system behaviors that result in higher than usual energy drainage. Abhinav Pathak in [17] presents an extensive taxonomy of the types of energy bugs by mining 1.2 million posts from various online mobile user forums and OS bug repositories. Energy bugs can happen as a result of errors in any aspect of the mobile device, hardware or software, whether externally or internally triggered [17]. As an example, a hardware ebug might arise because of a faulty mobile battery that is unable to hold its charge, resulting in faster depletion. Also, an old SIM card with scratches, bends and worn-out electrical surface contacts results in energy leaks and, hence, an ebug by definition [18]. An ebug can also be caused by an external hardware error such as a faulty charger. Software ebugs can be caused by errors in the OS, firmware, and mobile apps [17]. This taxonomy shows not only the difficulty of foreseeing and predicting an ebug, but also the large number of potential causes that can initiate such abnormal drainage behavior.


Adam J. Oliner in [19] explores the energy drainage behavior specifically of mobile applications, and classifies apps as either having an energy bug or being an energy hog. He defines energy bugs for mobile apps as any activities that are not intrinsic to the app's function but result in energy drainage. So an application has an ebug if one instance of the app drains more energy than other instances of the same app, i.e., if the drainage behavior manifests itself in some instances of the app as opposed to all of them. For example, if one instance of an app goes into an infinite loop as a result of a condition triggered by the Android Jellybean OS while it is handling GPS location, the app is classified as having an energy bug due to that misbehavior; the condition that triggered the infinite loop was not part of the app and, hence, not intrinsic to its functionality. Conversely, the app is an energy hog if, for all its instances, it consumes more energy than an average app. The average app consumption is computed from data samples collected from mobile phones that are not running the app under test. As an example, if an advanced 3D game consumes more energy, for all of its instances, than the average energy consumed by applications running on mobile devices not including the 3D game, then that app is classified as an energy hog. Oliner used a community of mobile users collaboratively testing 2664 apps using a tool called Carat [19]. Although the community consisted of independent members, the energy data recordings collected using Carat have been shown to be similar to those collected in a controlled lab environment using the Monsoon Power Monitor [4]. However, Oliner focused on measuring the drainage of apps as a whole, as opposed to measuring the energy drainage of specific features of the applications, which is what this work aims to do. An application feature can be defined as a collection of tasks, e.g. system calls, aimed to achieve a single function.

In [20], Abhinav Pathak, Y. Charlie Hu and Ming Zhang design and implement a fine-grained methodology to account for the energy consumption of mobile apps. They set out to find where the energy is spent when performing actions such as Google searching in the browser, uploading a photo album, using a location app, syncing a file to the mobile phone, playing a game such as AngryBirds, etc. Eprof, a fine-grained energy profiler for smartphone apps, is used to achieve this task. Different activities and apps were tested on 3 different mobile devices running Windows Mobile and Android OSes. The results from Eprof's energy accounting policies were compared to results from utilization-based and time-split accounting policies, both explained in Section 2.3.2. The accounting accuracies are found to be drastically different for the three accounting methods, and for the different accounting granularities of the utilization-based accounting scheme, i.e. process, thread, routine and system-call granularities. The approach followed by the researchers, although successful, involved fine-grained accounting that does not allow for explicitly determining an app feature.

2.2 Energy Profiling Tools

There exist many tools to measure the energy consumption of software entities, e.g. mobile apps, and hardware components, e.g. GPS, on mobile devices. This section presents some of the state-of-the-art and popular tools used by researchers. The selected tools cover a range of energy accounting methods. The following sections cover Trepn Profiler, PowerTutor, AppScope and the Monsoon Power Monitor.

2.2.1 Trepn Profiler

Qualcomm has developed the Trepn Profiler as a diagnostic tool that helps mobile app developers profile system performance and power consumption. Some of the metrics tracked by Trepn Profiler include CPU usage and frequency, memory usage, screen state, Wi-Fi RSSI level and battery level. Trepn is capable of system-level or app-level profiling for devices featuring Qualcomm Snapdragon processors, which are common to many mobile devices currently on the market. Trepn can be controlled externally, which aids in automating testing. In addition, Trepn allows the tester to export a data file containing all the collected measurements to be externally analyzed for app optimization.

A concern with the Trepn Profiler is that it uses hardware sensors to collect recordings for its metrics and relies on the internal power management Integrated Circuit (IC) to perform the battery level measurement. The problem with this approach is that many modern smartphones, which use complex algorithms to calculate the battery level, come without an internal power management IC; when the required IC is missing, Trepn automatically disables the battery level measurement on the device. In addition, this approach is inadequate for measuring the energy consumption of mobile app features.

2.2.2 PowerTutor

PowerTutor is a mobile app made specifically for Android phones to allow app developers and system testers to view the power consumption of major system components. The tool displays, in real time, the power consumption of the system hardware components (such as the CPU, GPU and Wi-Fi interfaces) and software applications. PowerTutor uses power consumption models that provide estimates of power consumption within 5% of the actual values [21]. In addition, the PowerTutor app provides the tester with a text log file containing detailed testing results. Figure 2.1 shows an example of the graphs PowerTutor provides.

Figure 2.1: Example of power consumption graphs provided by PowerTutor [1].

PowerTutor has been designed to use power models for estimating power consumption. The available power models only support the HTC G1, HTC G2 and Nexus One smartphones. It cautions the user that its results may be inaccurate whenever it is used on other mobile platforms.
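Utilization-based models like PowerTutor's are commonly linear in the utilization of each hardware component. The following is only an illustrative sketch of that idea; the component names and coefficients are hypothetical, not PowerTutor's fitted per-device values:

```python
# Hypothetical regression coefficients in mW per unit utilization.
# Real utilization-based models are fitted per device and are more detailed.
BETA = {"cpu": 450.0, "wifi": 700.0, "screen": 900.0}
IDLE_MW = 70.0  # assumed baseline draw when every component is idle

def power_estimate_mw(utilization):
    """Linear utilization-based power model:
    P = P_idle + sum(beta_c * u_c), with each utilization u_c in [0, 1]."""
    return IDLE_MW + sum(BETA[c] * u for c, u in utilization.items())

# Half-loaded CPU with the screen fully on, Wi-Fi idle:
estimate = power_estimate_mw({"cpu": 0.5, "wifi": 0.0, "screen": 1.0})
```

The quality of such an estimate depends entirely on how well the coefficients were fitted for the specific device, which is why PowerTutor warns against use on unsupported platforms.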

2.2.3 AppScope

AppScope [22] is an Android-based energy measuring system for estimating the energy consumption of mobile devices. It monitors mobile app usage of hardware components at the kernel level and uses an event-driven monitoring method. The system utilizes power models and usage statistics for each mobile hardware component that it performs energy metering for. The AppScope suite currently comes with AppScopeViewer, a runtime Java application that interacts with AppScope on the target device to produce a graphical energy profile. Figure 2.2 shows AppScopeViewer with a sample of power consumption data. The suite also comes with DevScope, an online tool for power analysis of smartphone hardware components. It is nonintrusive in the sense that it uses power modelling techniques at run time without requiring any external devices, artificial interventions, or modifications of the mobile OS [23].

Figure 2.2: AppScopeViewer with collected sample power consumption data [2].

A major concern with AppScope is that it requires rooting the mobile device. Rooting means acquiring root, i.e. administrator or superuser, system access. In order to probe kernel activity, AppScope is implemented as a loadable kernel module that needs to be compiled with specific flags turned on. Hence, the tester needs to unlock the bootloader, which requires rooting the device, and flash the AppScope kernel image before starting to collect measurements. Rooting devices violates end-user agreements and is not a suitable commercial approach.

2.2.4 Monsoon Power Monitor

The Monsoon Power Monitor, developed by Monsoon Solutions Inc., is a hardware- and software-based system that allows the tester to collect power consumption information from any mobile phone that uses a single lithium (Li) battery. The hardware component is an external device that acts as a voltage-adjustable power supply and a current and voltage monitor. The power monitor is connected to the mobile device's battery contacts and to a PC to interact with the software component. The software component, PowerTool, is a Windows PC application that provides a graphical representation of the results and tools to perform basic power calculations. Figure 2.4 and Figure 2.3 show the components of the Monsoon Power Monitor system.


Figure 2.4: Monsoon Power Monitor external hardware power supply and power monitor [3].

The Monsoon system requires bypassing the phone battery by connecting the phone battery contacts to the Monsoon power supply probes and electrically insulating the terminals of the battery. The approach provides global measurements of power consumption, meaning that the measurements indicate the overall system energy drainage. This system is used by various testers and researchers for its ease of use and accuracy [20][24][25][26]. It provides readings at 5,000 samples per second, which can detect small and short changes in power consumption due to mobile device activities. The limitation of this approach is that it can only be used on mobile devices with accessible removable batteries. Of the standard approaches, the Monsoon Power Monitor best fits this thesis's research needs.
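Given synchronized current and voltage samples of the kind the Monsoon system records, total energy follows from summing instantaneous power over the sampling interval. A minimal sketch (the function name and values are illustrative, not part of Monsoon's PowerTool API):

```python
def energy_joules(current_a, voltage_v, sample_rate_hz=5000):
    """Total energy in joules from synchronized current (A) and voltage (V)
    samples: E = sum(V_i * I_i * dt), with dt = 1 / sample_rate_hz."""
    dt = 1.0 / sample_rate_hz
    return sum(v * i * dt for v, i in zip(voltage_v, current_a))

# One second of a phone idling at 3.7 V while drawing a steady 0.1 A:
e = energy_joules([0.1] * 5000, [3.7] * 5000)  # 3.7 V * 0.1 A * 1 s = 0.37 J
```

At 5,000 samples per second, dt is 200 microseconds, which is short enough to capture the brief current spikes caused by individual device activities.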

2.3 Energy Profiling Methodologies

This section looks at the most well-known methodologies used by testers and researchers to profile energy and power consumption on mobile devices. The idea behind this survey is to give some perspective on the complexity and variety of approaches that have been proposed. The selected approaches are not meant to be comprehensive; rather, they are meant to showcase popular ones and the objectives each approach serves. It is worth noting that these methodologies are not mutually exclusive and combinations of them can be used. This section covers collaborative energy diagnosis, utilization-based, split-time, and system-call level approaches.

2.3.1 Collaborative Energy Diagnosis

Collaborative diagnosis requires the involvement of a community of users to collect reference usage data and app-under-test usage data. Using statistical means, an error, or ebug, can be determined by calculating a deviation from the expected behavior, i.e. the reference. To the best of our knowledge, there have not been many available implementations of collaborative methodologies for mobile device diagnosis specifically focused on energy profiling measurements. This scarcity is due to the complexity of the approach and the large overhead required to gather the community members. The two successfully implemented and used systems are IQ Agent and Carat.

One of the earliest and most widely spread systems used for collaborative mobile device performance analysis is IQ Agent, which is privately developed and owned by Carrier IQ. IQ Agent is a tool that provides network operators, i.e. carriers, with diagnostic services to help identify how wireless networks are performing. This helps network operators strategically plan projects, such as new tower installations, based on the feedback from the community of mobile devices on the network. The information provided by IQ Agent can also be used for customer care by providing the customer representative with information about the activities happening on the mobile device. The representative can see information such as the history of dropped calls, signal strengths, app installations and battery levels. A customer may call complaining about the short life span of his mobile battery, and the representative would identify the source of the problem as, for example, a recently installed mobile app with high battery drainage [27]. IQ Agent collects diagnostic information from millions of phones, allowing the carrier to diagnose various problems. This tool is privately owned and is not available for testing.

Carat [28] is another profiling tool specifically focused on energy and battery usage, and it adopts a collaborative diagnosis methodology. Carat is a free mobile app started as a research project at UC Berkeley aiming to provide mobile end-users with analytics about their battery usage. Through the information collected from thousands of devices, Carat profiles mobile processes and applications and provides the end-user with actionable items to minimize battery consumption. The app bases its recommendations on abnormalities identified through statistical analysis. Such actions include killing a specific app, e.g. App X, or upgrading the OS. Adam J. Oliner in [19] explains that Carat works by comparing probability distributions of energy usage rates. A reference probability distribution is created from the community data collected by Carat. Also, energy probability distributions are created for each app from the energy consumption data collectively gathered from the users with Carat installed on their phones. Two methods of comparison are followed: comparing the energy consumption of the app to the reference distribution, and comparing the energy consumption of every instance of the app to the collective app energy consumption distribution. A mobile app is considered an energy hog if it has a higher energy consumption rate than other apps. An instance of a specific app is considered buggy, i.e. has an ebug, if that instance results in a higher energy consumption rate than all other instances of the same app. Figure 2.5 and Figure 2.6 explain this concept further.


Figure 2.5: Energy consumption rate probability distributions of a community with and without subject app installed [4].

The expected energy consumption rate is computed for both the reference energy consumption probability distribution and the mobile app energy consumption probability distribution. The reference energy probability distribution is the energy consumption rate probability distribution when the mobile app under test is not running on the device. If the expected energy consumption rate of the subject app's distribution is significantly higher than that of the reference distribution, then the subject app, e.g. App X, is classified as an energy hog. Statistical methods are used to compute the relative distance D between the expected energy consumption rates of the distributions and the significance of that distance.


Figure 2.6: Energy consumption rate probability distributions of an app and of an instance of the app collected from a community of users [4].

A specific instance of the mobile app under test, e.g. App X running on Android 4.0 on the Samsung Note II platform, might result in a higher energy consumption rate than other instances of the same app. In this case, the instance with the higher than usual energy consumption rate is said to have an energy bug, or ebug. Carat also uses energy consumption probability distributions to classify app instances containing ebugs. An energy consumption probability distribution for each instance is computed and compared to the energy consumption probability distribution of all instances. The expected energy consumption rate of the app instance is compared to the expected energy consumption rate of all instances. If the distance D between the expected energy consumption rates is statistically significant, then the instance under test is classified as buggy.
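The hog test can be sketched in a few lines. This is a simplified illustration of the idea, not Carat's actual implementation: it approximates "the distance D is significant" by requiring the normal-approximation confidence intervals of the two expected rates not to overlap, and all names are ours:

```python
import math
import statistics

def expected_rate_ci(rates, z=1.96):
    """Mean energy-use rate with a ~95% normal-approximation CI half-width."""
    mean = statistics.mean(rates)
    half = z * statistics.stdev(rates) / math.sqrt(len(rates))
    return mean - half, mean, mean + half

def is_hog(app_rates, reference_rates):
    """Flag the app as a hog if its expected rate exceeds the reference
    expected rate and the two confidence intervals do not overlap."""
    app_lo, app_mean, _ = expected_rate_ci(app_rates)
    _, ref_mean, ref_hi = expected_rate_ci(reference_rates)
    distance = app_mean - ref_mean  # the distance D between expected rates
    return distance > 0 and app_lo > ref_hi
```

The same comparison applied to one instance's rates versus the rates of all instances of the same app would flag an ebug rather than a hog.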

IQ Agent and Carat are not used for achieving the objective of this thesis work. The aim of this work is to study the energy consumption of the different app features as opposed to the whole app's energy consumption, and the available collaborative approaches described in this section do not provide this capability. However, this work draws upon some of the statistical means used by Carat. Carat compares the expected energy consumption rates of energy consumption probability distributions. The statistical method developed for this work and detailed in Chapter 4 also uses energy consumption probability distributions and compares the most probable energy consumption rates.

2.3.2 Utilization-Based and Time-Split Accounting Policies

Two common energy accounting policies are utilization-based and split-time [20]. Utilization-based accounting policies are based on periodically checking the utilization of mobile hardware components, such as the CPU, and using models to correlate the utilization with energy drainage. PowerTutor and Carat both use utilization-based approaches to perform energy profiling. Split-time accounting policies divide the device running time into fine-grained fixed-size bins and measure the energy consumption over each bin's duration. The policies then associate the energy consumption of each time bin with the entity undergoing energy profiling, e.g. process, routine, thread, etc. Usually, the device's global energy consumption measurements are made using an external monitoring device. PowerScope [24] is one energy profiling system that uses a split-time accounting policy and captures energy recordings for each time bin via an external power monitor. Utilization-based energy accounting approaches have been discussed by many researchers [29][30][31][32]. Split-time accounting approaches have been used by many as well [24][33][34][35].

Both utilization-based and split-time accounting policies are problematic for modern smartphone energy profiling. Abhinav Pathak in [36] argues that utilization-based power accounting uses power models that correlate the utilization of a mobile hardware component to the power state of that component. This approach periodically samples the hardware component's usage counter. As a result, a utilization-based approach fails to accurately account for energy drainage resulting from lower power states or tail power states because the power models assume utilization only when the component is in a high power state. In addition, this approach fails to correctly account for energy consumption resulting from routines that are shorter in duration than the utilization counter sampling period: if the routine starts and finishes in less time than one sampling period, the routine's activity is not detected. In [20], Pathak performs a study of the error associated with utilization-based and split-time policies. He states that inaccuracy in the power models used by utilization-based approaches is a contributing factor to the error in energy accounting. He also finds that split-time accounting policies have large errors because this approach is oblivious to the granularity of what is running on the mobile phone. Using split-time approaches does not help differentiate between threads, processes and routines. Split-time also fails to account for asynchronous power behavior, explored in Section 2.4, because it purely depends on dividing the time interval into small bins without direct knowledge of when the entity under test, e.g. a routine, begins and ends.

For the purposes of this research, utilization-based accounting approaches are not used, to avoid skewing the results. Utilization-based approaches require sampling the hardware utilization counters periodically, which itself consumes energy. In addition, our work looks at multiple mobile device platforms for which no accurate power models are available for utilization-based approaches. However, statistical analysis combined with split-time accounting policies can be used to achieve the objectives of this research. By measuring the instantaneous global energy consumption of the device and applying statistical analysis, features resulting in different energy consumption rates within the same app can be extracted.
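The split-time idea itself is simple to state in code. A minimal sketch of binning a globally measured current trace into fixed-size time bins and computing per-bin energy, assuming a constant supply voltage (names and values are ours):

```python
def bin_energy(current_ma, voltage_v, sample_rate_hz, bin_s):
    """Split-time accounting: divide a current trace (mA) into fixed-size
    time bins of bin_s seconds and return the energy (mJ) in each bin."""
    samples_per_bin = int(bin_s * sample_rate_hz)
    dt = 1.0 / sample_rate_hz
    bins = []
    for start in range(0, len(current_ma), samples_per_bin):
        chunk = current_ma[start:start + samples_per_bin]
        bins.append(sum(i * voltage_v * dt for i in chunk))
    return bins

# Trace sampled at 10 Hz: one second near 100 mA, one second near 300 mA.
per_bin_energy = bin_energy([100] * 10 + [300] * 10,
                            voltage_v=3.7, sample_rate_hz=10, bin_s=1.0)
```

Each bin's energy must then be attributed to whatever was running during that bin, which is exactly where the approach's obliviousness to software granularity becomes a problem.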

2.3.3 System-Call Based

System-call-based profiling is an energy profiling method that uses system calls as power consumption sampling triggers and power models as a reference to estimate the energy consumption of the mobile phone. Each system call can be traced to the routine or process that triggers it. This tracing allows for fine-grained energy accounting. In addition, software entities, such as routines, processes and threads, gain access to mobile hardware components only through system calls. Therefore, the name of the system call and the parameters associated with it provide sufficient information about which hardware component is being used. Utilization-based energy accounting approaches use power models that correlate hardware component utilization to the hardware component's power state, a task done by periodically sampling the hardware component's utilization counter. As a result, utilization-based approaches are only accurate in accounting for the power consumption of utilization-based behaviors. This means that non-utilization-based power activities, such as turning the camera on or running the GPS, are not accounted for by utilization-based approaches due to the lack of a utilization counter to be sampled. However, system-call power accounting approaches can detect non-utilization-based activities by analysing the parameters associated with each system call. A specific routine can perform a system call to turn on the camera of the mobile device, and the system-call power accounting approach can detect this request and associate the power consumed due to this activity with the routine initiating it. System-call energy accounting approaches for mobile phones have been discussed by several researchers [20][36].

The scope of this research work requires treating the Tutela NAT experience app as a blackbox. System-call approaches require instrumenting the mobile app to determine which routine performed the request. Also, system-call approaches are too fine-grained for measuring app features: each mobile app feature might involve multiple routines and processes, but the goal is to attribute power consumption to the feature as a whole as opposed to the contributing components of the feature. Hence, system-call power accounting is not used for this research work.

2.4 Asynchronous Power Behaviour

Mobile phone hardware components, such as the GPS, can switch between multiple power states, and the power drainage is different in each of these states. Each hardware component has a base state at which the component consumes the minimum rate of energy regardless of the state of the phone's overall activities and regardless of the power state of the other hardware components [20]. For example, the typical 3G wireless radio on a mobile phone has three states: standby, low power and full power [37]. The standby state is the base state where the radio consumes near zero energy. The full power and low power states are active states in which the radio is in use, with the low power state consuming less energy than the full power state.

Modern smartphones exhibit what is known as asynchronous power behavior, where the power impact of a hardware component utilized by an entity such as a routine persists even after the routine is completed. For example, a process can utilize the 3G wireless radio to perform a file transfer, putting the radio in the full power state. After the completion of the transfer, however, the radio might remain in its full power state, consuming energy at a high rate even though it is not being utilized. Modern smartphones employ power management policies that minimize overall energy consumption but can also contribute to the asynchronous power behavior. A hardware component consumes more energy when in an active state, but it also consumes energy when transitioning between states, such as waking up from the standby power state to a high active power state. Hence, power management policies in modern smartphones attempt to efficiently balance the number of power state transitions against the duration of time that the component remains in an active power state. If a 1 MB file is transferred in 10 chunks with the wireless radio changing power states between each chunk, e.g. from full power to standby and then to full power again, then the total energy consumed might be more than that of keeping the radio in the full power state until the completion of the transfer. This tail power consumption behavior after the completion of the transfer poses challenges for energy profiling and accounting methodologies.

Wakelocks also produce an asynchronous power behavior that is challenging for energy accounting policies [20]. Modern mobile OSes apply aggressive energy management and sleeping policies to save energy. Hence, if an entity requires keeping a mobile phone component on regardless of what other activities are happening on the phone, it needs to acquire a wakelock. An issue arises when the entity finishes its tasks without releasing the wakelock. As a result, the hardware component remains in an active state until another process releases the wakelock. Energy consumption accounting for such a situation becomes difficult because tracking entities' acquisition and release of wakelocks is not consistent. The situation is similar for other non-utilization-based components such as the camera and accelerometer. Once these components are turned on by an entity, they remain on, consuming energy until they are turned off, even if they are not being utilized. This can lead to a situation where a camera is turned on by an app, utilized, but then not turned off, and another app later starts utilizing the camera while it is still on. Accounting for each app's usage of the camera is then challenging.

Utilization-based approaches assume a correlation between utilization level and power state: such models infer a certain hardware component utilization level from the power state that component is in. Hence, these models fail to accurately account for the energy resulting from the asynchronous power behavior exhibited by non-utilization-based hardware components, such as the camera and accelerometer; indeed, their energy consumption cannot be accounted for at all because no component utilization counter exists for them [36]. Split-time energy accounting approaches also face difficulty because they are oblivious to the granularity of the software entities that use the hardware components. Split-time accounting policies divide time into fixed-size bins and measure the device's global energy consumption using an external monitoring device. Even if the mobile device is instrumented to detect which entity is utilizing which component, the tail power will not be accounted for properly. If the entity completes without turning the hardware component off, the component continues to consume energy when it is not expected to be utilized. Finally, asynchronous power behavior still poses accounting challenges even when system-call energy accounting approaches are used. A mobile app can issue an I/O call to turn the camera on without turning it off after completing its camera-utilizing tasks; the camera then continues to drain the battery even though it is not being utilized. Another app may start utilizing the camera later on and turn it off when done, but the challenge of associating the camera's energy drainage with the appropriate app remains.
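The utilization-based models discussed above are typically linear, summing a per-component idle cost and a utilization-scaled active term; in practice the coefficients are fitted by regression. The sketch below uses made-up coefficients and shows the blind spot directly: components without a utilization counter simply cannot appear in the model.

```python
# A minimal linear utilization-based power model. All coefficients are
# invented illustrative values, not measurements from any device.
COEFFS = {             # (idle_power_w, extra_power_w_at_full_utilization)
    "cpu":    (0.05, 0.60),
    "wifi":   (0.02, 0.70),
    "screen": (0.10, 0.45),
}

def modeled_power_w(utilization):
    """utilization: {component: fraction in [0, 1]}.
    Non-utilization-based components (camera, accelerometer) have no
    utilization counter, so they cannot be represented here at all."""
    return sum(idle + utilization.get(comp, 0.0) * active
               for comp, (idle, active) in COEFFS.items())
```

For example, `modeled_power_w({"cpu": 0.5})` yields 0.35 W for the CPU plus the idle terms of the other modeled components, while an active-but-idle-looking camera contributes nothing to the estimate.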

For the purposes of this research work, computing the energy drainage associated with app features using global device-level energy monitoring and statistical approaches eliminates the effect of asynchronous power behavior. Main app features can utilize various hardware components and software entities while our accounting method abstracts the details. Using external global device-level energy monitoring means that the impact on the overall device is considered, as opposed to the impact of specific components. To classify the power impact, statistical and pattern-recognition approaches are used to extract each feature from the collected data.
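One simple form such extraction can take is a threshold pass over the device-level current trace: samples above a baseline-plus-margin level are grouped into contiguous events whose charge is then integrated. This is a deliberately simplified stand-in for the statistical methods detailed in later chapters; the threshold rule and all numbers are illustrative assumptions.

```python
import statistics

def extract_events(current_ma, sample_period_s, k=1.0):
    """Find contiguous above-threshold runs in a current trace and
    return one (start_index, end_index, charge_mAh) tuple per event.
    Threshold = median + k * stdev (a simplistic illustrative rule)."""
    threshold = statistics.median(current_ma) + k * statistics.pstdev(current_ma)
    events, start = [], None
    for i, sample in enumerate(current_ma + [float("-inf")]):  # sentinel closes a trailing event
        if sample > threshold and start is None:
            start = i
        elif sample <= threshold and start is not None:
            # Rectangle-rule integration of current over the event, in mA.h
            charge_mah = sum(current_ma[start:i]) * sample_period_s / 3600.0
            events.append((start, i - 1, charge_mah))
            start = None
    return events
```

For a 1 Hz trace idling near 100 mA with a five-sample burst at 300 mA, this returns a single event covering the burst with roughly 0.42 mA.h of charge.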

2.5 Battery Profiling

Taking the battery into consideration when profiling a mobile app poses another challenge as a result of the battery's state. Ravishankar Rao in [38] explains some of the factors that affect the discharge characteristics of Lithium-ion batteries. He indicates that as the discharge rate increases, e.g. due to higher phone activity, the battery capacity decreases. The battery capacity is a measure of the amount of charge the battery holds and is usually expressed in mA.h. Environmental temperature affects not only the battery capacity but also the voltage delivered to the phone. In addition, the battery capacity decreases with every discharge-recharge cycle. Modern smartphones use aggressive battery management policies, many of which are based on battery readings. The device can force some processes to stop, put the phone into sleep mode, or change the power state of different components in an attempt to sustain the battery for longer periods. When performing multiple runs of the same experiment, the battery state is rarely the same across those runs. This means that the phone's behavior is not consistent across the experimental runs, which in turn means that there is another uncontrolled variable to be accounted for in the analysis of power consumption. Because battery discharge characteristics are used by the mobile phone to make many of its power management decisions, it is best to bypass the battery completely to minimize the variability in the phone's behavior.
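A classical way to model the capacity-versus-discharge-rate effect that Rao describes is Peukert's law, under which delivery time scales as a power of the discharge current. The sketch below applies it with an assumed Peukert exponent; the law and all numbers are illustrative background, not taken from [38] and not used in this thesis's analysis:

```python
def effective_capacity_mah(rated_capacity_mah, rated_current_ma,
                           actual_current_ma, peukert_exponent=1.2):
    """Estimate deliverable capacity at a given discharge current using
    Peukert's law: t = (C_rated / I_rated) * (I_rated / I)**k, so the
    deliverable charge at current I is I * t. The exponent 1.2 is an
    assumed, typical-looking value, not a measured one."""
    hours = (rated_capacity_mah / rated_current_ma) * \
            (rated_current_ma / actual_current_ma) ** peukert_exponent
    return actual_current_ma * hours
```

Doubling the discharge current of a battery rated 1500 mA.h at 300 mA drops the deliverable capacity to roughly 1300 mA.h under this model, which is exactly the kind of state-dependent variability that bypassing the battery avoids.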

For the purposes of this research work, the battery of the mobile phone is bypassed completely. The required current and voltage are supplied directly by the Monsoon Power Monitor. This setup helps mitigate the influence of the power management policies on the experimental results. With a constant supply of power to the phone, interference from power management policies triggered by battery discharge characteristics is minimized.

2.6 Summary

This chapter covered relevant work and tools in the space of power and energy profiling. Given the large body of contributions to, and the complexity of, the energy profiling research space, the chapter focused on aspects specifically pertaining to the scope of this research work. It covered the concept of energy hogs and energy bugs, looked at some state-of-the-art energy profiling tools, illustrated some known energy profiling methodologies, and outlined the issues arising from energy profiling in the presence of mobile batteries. The next chapter details the experimental design for collecting the required time-series power recordings.


Chapter 3

Experimental Design

This research work aims to determine the impact of the Tutela Network Assessment Tool (NAT) experience app on the user experience by quantitatively assessing the energy consumption of the app's features. This chapter describes the design of the experiment used to collect the data required to achieve the objectives of the research. First, the chapter presents the application under test by illustrating the functionality of the app and the features to be tested. It then describes the controlled environment in which the phones are tested. The tests are conducted in collaboration with a third party, Wavefront Inc., which has extensive experience in commercial mobile device performance measurement. In addition, the chapter details the experimental plan, showing the different types of tests performed and the rationale behind them. The chapter concludes with the method of selecting the mobile devices on which to test the Tutela app.

3.1 Application Under Test

Tutela Technologies Inc. has developed an Android application called the Network Assessment Tool, or NAT, that helps wireless service providers and app developers understand the end-user experience. The application is designed to perform periodic network communication tests and compute analytical descriptors for use by wireless carriers and app developers. This section presents, at a high level, the NAT app, which is the object of the experiment. The main features of the app, the database logging server, and the different modes of the app are discussed in this section.


3.1.1 Application Features

The NAT app performs two main periodic tests to collect the required user-experience data: a Quality-of-Service (QoS) test and a Throughput (TP) test. The QoS test consists of sending a series of UDP packets with serially numbered payloads to a remote server. Using the server's responses to the UDP packets, the application measures performance values such as delay and packet loss. The QoS test is periodic with a default period of 30 seconds, which is the shortest allowable period for this test. Also, by default, this test is performed only over Wi-Fi connections unless specifically configured otherwise. For the purposes of these experiments, the default period is used, and the tests are performed with the phone connected to Wi-Fi as explained in Section 3.2.
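Loss and delay can be derived directly from the sequence numbers and timestamps of such probes. The following is a minimal sketch; the record layout is an assumption for illustration, not NAT's actual implementation:

```python
def qos_metrics(sent, responses):
    """Compute packet loss and mean round-trip delay from one QoS probe.

    sent:      {seq: send_time_s} for every UDP packet transmitted
    responses: {seq: recv_time_s} for every server response received
    Returns (loss_fraction, mean_rtt_s); mean_rtt_s is None if every
    packet was lost. Field names and layout are illustrative only.
    """
    loss = sum(1 for seq in sent if seq not in responses) / len(sent)
    rtts = [responses[seq] - sent[seq] for seq in sent if seq in responses]
    mean_rtt = sum(rtts) / len(rtts) if rtts else None
    return loss, mean_rtt
```

With four packets sent and the response to sequence number 2 missing, this reports 25% loss and averages the delays of the three echoed packets.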

The Throughput test consumes more resources than the QoS test and involves downloading a 1 MB file from, and uploading a 500 KB file to, a cloud infrastructure. This test aims to determine the goodput, which is defined as the application-level throughput [39], i.e. the rate at which useful data is delivered. In other words, the test determines the amount of useful data delivered to the target destination, excluding protocol overhead and retransmitted packets, per unit of total delivery time. The closer the goodput value is to the throughput value, the better the network performance, and consequently the better the user experience. The NAT TP test is programmed to run only over Wi-Fi connections. The default period of the TP test is 120 seconds, which is the shortest period allowed by the NAT experience app. For the purposes of these experiments, the default period of 120 seconds is used. NAT calculates the location of the mobile device as part of the collected analytics using GPS or network-field calculations: GPS can be enabled to determine the location, or the default network-field mechanism can be used. The details of how each approach works are outside the scope of this thesis. The NAT app also provides an option for anonymizing the data recordings, making all logged network performance data points anonymous before transmission over the network. For the purposes of this research work, the configuration for anonymizing data recordings is turned on at all times. However, both location-determining mechanisms, GPS and network field, are tested for comparison. The rest of the testing configuration is detailed in later sections of this chapter.
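Goodput reduces to useful payload bits divided by total delivery time. A one-line sketch, with an invented transfer time for the 1 MB download case:

```python
def goodput_bps(payload_bytes, elapsed_s):
    """Application-level throughput: useful payload bits delivered per
    second of total transfer time. Protocol overhead and retransmissions
    are excluded by counting only the file's own bytes."""
    return payload_bytes * 8 / elapsed_s

# e.g. a 1 MB download completing in an assumed 2.5 s:
# goodput_bps(1_000_000, 2.5) -> 3_200_000.0 (3.2 Mbit/s)
```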
