Citation for published version (APA):

Amft, O. D., & Tröster, G. (2009). On-body sensing solutions for automatic dietary monitoring. IEEE Pervasive Computing, 8(2), 62-70. https://doi.org/10.1109/MPRV.2009.32

Wearable Computing and Healthcare

On-Body Sensing Solutions for Automatic Dietary Monitoring

Oliver Amft, TU Eindhoven
Gerhard Tröster, ETH Zurich

Various on-body sensors can gather vital information about an individual's food intake. Such data can both help weight-loss professionals personalize programs for clients and inform nutrition research on eating behaviors.

Maintaining a balance between ingested energy through food consumption and expended energy in daily life is key to a person's long-term health. However, as the pandemic of overweight and obese individuals attests, this balance is challenging to maintain. According to World Health Organization estimates, more than one billion adults worldwide were overweight and 400 million were obese in 2005 (www.who.int/topics/obesity/en/). By 2015, WHO predicts the number of obese people alone will increase to more than 700 million.

To support active weight control, various public and private organizations have established weight and diet management programs. Such programs coach individuals to improve eating behaviors using daily or weekly status feedback, meal suggestions, and behavior recommendations. However, as Rena Wing and Suzanne Phelan found, only 20 percent of the people who achieve at least a 10 percent reduction in body weight can maintain that new weight for one year.1 They therefore concluded that, to increase coaching program success, participants need two to five years of continuous support. In addition, further eating behavior research is needed to advance our understanding of human drive and restraint regarding food intake. The main limitation of current programs aimed at achieving this is a practical one: Participants must complete detailed self-reports on their eating behavior, while also maintaining their changed lifestyles and eating behavior on a day-to-day basis.

Along with a personal profile, self-reports are currently the sole source of information for adapting and personalizing feedback and recommendations for coaching program participants. Unfortunately, self-reports have a high bias and are hard to maintain. A key shortcoming is that individual respondents can differ dramatically in their motivation to complete questionnaires, their awareness of food intake (particularly with snacks), their literacy level, and their memory and perception capabilities.2 Dale Schoeller, for example, found that self-reporting estimations ranged from 50 percent under to 50 percent over the actual intake amount.3

To address these issues, automatic dietary monitoring (ADM) aims to replace manual eating-behavior reporting with a sensor-based estimation. Here, we discuss requirements and options for on-body sensing of eating behavior, and demonstrate that such sensor information can resemble some self-report information.


These initial ADM research prototypes aren't yet comfortable enough for continuous use over months or years. However, they do highlight crucial benefits of on-body sensing, the ADM concept for eating-behavior research, and future directions for diet coaching.

Toward ADM—How Will It Help?

As the sidebar, "Diet Monitoring Approaches," describes, researchers have proposed a broad range of alternate self-reporting solutions, including recent attempts to automate eating behavior recognition using ubiquitous sensors. As we see it, sensor-based automatic monitoring will both release individuals from the stringent demands of manual reporting and provide researchers with more robust eating behavior information. Hence, ADM will simplify the long-term coaching programs on eating behavior that are urgently needed and infeasible with current, manual monitoring techniques.

To replace manual logging, ADM systems supply information on eating behavior, as self-reports conceptually intend. For every consumed food, this information—the dimensions of eating behavior—includes

• intake timing (daily schedule),
• food type or category,
• food amount, and
• energy content (calories).

ADM systems must also meet operational requirements, prove robust, and offer comfort suitable for long-term use.

Sidebar: Diet Monitoring Approaches

Classic dietary monitoring techniques require users to manually record their eating behavior. Among these assessments, self-reports are intended to capture every food intake, on a day-to-day basis, as required by weight and diet management programs. However, low adherence and accuracy restrict the reports' validity and thus the benefit of coaching programs that use them.1

Researchers have made multiple attempts to simplify tedious and error-prone logging. However, studies have confirmed that replacing paper-based reports with manually operated electronic devices doesn't reduce reporting errors.2

Manual Methods

There are several alternate manual methods for capturing eating behavior information. Jennifer Mankoff and her colleagues scanned shopping receipts to simplify diet monitoring.3 MyFoodPhone Nutrition, Inc. (www.myfoodphone.com) introduced a commercial service that assesses food intake based on users' mobile-phone pictures. Katie Siek and her colleagues used bar codes and voice recordings to replace self-report questionnaires.4

Automated Solutions

With manual dietary monitoring, participants are asked to record their own eating behavior. In contrast, automatic dietary monitoring aims to estimate eating behavior without the person's active participation. We can categorize these automated techniques according to their sensing approach: ambient-embedded, on-body, or implantable.

Researchers have developed a few pioneering solutions that use ambient-embedded sensors. Keng-Hao Chang and his colleagues developed a dining table that detected the weight of foods and identified food bowls from radio-frequency identification tags.5 Jiang Gao and colleagues used surveillance video to identify arm movements to the mouth.6 In their general evaluation of RFID for home monitoring, Donald Patterson and his colleagues estimated morning activities, including breakfast consumption timing.7

Implantable solutions, such as in-oral sensing,8 could provide more precise information on the eating process. However, this solution is technically challenging and alters oral sensation. Hence, it appears infeasible for long-term diet monitoring.

References

1. L.E. Burke et al., "Using Instrumented Paper Diaries to Document Self-Monitoring Patterns in Weight Loss," Contemporary Clinical Trials, vol. 29, no. 2, 2008, pp. 182–193.

2. B.A. Yon et al., "The Use of a Personal Digital Assistant for Dietary Self-Monitoring Does Not Improve the Validity of Self-Reports of Energy Intake," J. American Dietetic Assoc., vol. 106, no. 8, 2006, pp. 1256–1259.

3. J. Mankoff et al., "Using Low-Cost Sensing to Support Nutritional Awareness," Proc. 4th Int'l Conf. Ubiquitous Comp., LNCS 2498, Springer Verlag, 2002, pp. 371–376.

4. K.A. Siek et al., "When Do We Eat? An Evaluation of Food Items Input into an Electronic Food Monitoring Application," Proc. 1st Int'l Conf. Pervasive Comp. Technologies for Healthcare, IEEE CS Press, 2006, pp. 1–10.

5. K.H. Chang et al., "The Diet-Aware Dining Table: Observing Dietary Behaviors over a Tabletop Surface," Proc. 4th Int'l Conf. Pervasive Comp., LNCS 3968, Springer Verlag, 2006, pp. 366–382.

6. J. Gao et al., "Dining Activity Analysis Using a Hidden Markov Model," Proc. 17th Int'l Conf. Pattern Recognition, vol. 2, IEEE CS Press, 2004, pp. 915–918.

7. D.J. Patterson et al., "Fine-Grained Activity Recognition by Aggregating Abstract Object Usage," Proc. 9th IEEE Int'l Symp. Wearable Computers, IEEE CS Press, 2005, pp. 44–51.

8. E. Stellar and E.E. Shrager, "Chews and Swallows and the Microstructure of Eating," The American Journal of Clinical Nutrition, vol. 42, 1985, pp. 973–982.


Challenges for ADM

For self-reports and ADM solutions, the challenge is to capture both the diversity of consumed foods and the variability in personal eating behaviors. For example, energy intake is most accurately determined by reporting the consumed food's calories. However, even with direct calorie reporting, energy estimation requires additional information, including the amount of consumed food and whether the person has altered it (such as by adding a dressing or sauce). Furthermore, calorie reporting is often complex and infeasible for homemade meals.

People have preferences about their food choices and categories, and their meal schedules. ADM solutions can integrate these preferences as prior information to estimate eating behavior. Still, a person's actual eating behavior is influenced by varying environmental and psychological aspects, including constraints in food availability, social interaction during meals, and emotions.

A particular challenge for ADM solutions is to robustly recognize eating behavior from the sensor data. No single sensor—independent of its location and recorded physiological or activity information—can capture all dimensions of eating behavior. The restrictions of initial ADM approaches reflect this challenge. Typically, the solutions emphasize particular dimensions of eating behavior, such as recording consumed food amounts using a weight scale, while restricting location to the weighing-enabled table. Moreover, solutions that rely exclusively on ambient-embedded sensors increase the challenge of robustly assigning measurements to one person. Although these works represent relevant advancements toward ADM, a multimodal sensing approach will better support monitoring of several eating behavior dimensions.

Benefits of On-Body Sensing

Monitoring eating behavior in a continuous and location-independent way is a vital ADM system property, as modern lifestyles imply location changes for both work and leisure purposes. Consequently, people consume food in various locations and in transit. Solutions that depend on a particular environment—such as the home—would miss snacks, let alone entire business lunches. Such partial coverage severely limits the effect of behavior coaching and could produce misleading recommendations.

Coaching and research-oriented behavioral understanding require continuous diet monitoring that covers all situations. On-body sensors can continuously monitor eating behavior, independent of dedicated sensor-enabled environments. Also, in contrast to ambient-embedded sensors, on-body sensors directly associate recorded information with the wearer.

Evaluation of On-Body Sensing Solutions

We analyzed on-body sensing approaches and modalities to evaluate the benefits for ADM. As Figure 1 shows, our analysis covered activities related to eating and physiological responses to food consumption. We analyzed which eating behavior dimensions a particular solution helps to estimate, as well as its limitations. We also assessed how comfortable it is to wear.

Figure 1. Major on-body sensing solutions for food intake. We used intake gestures, chewing, and swallowing activities to estimate food intake cycles. The figure summarizes the following activities and physiological responses:

• Swallowing: swallowing reflex initiated during food intake.
• Chewing: chewing strokes and food breakdown sounds during food intake.
• Cardiac responses: heart rate and blood pressure changes related to food intake.
• Gastric activity: stomach activity and bowel sounds related to food intake.
• Intake gestures: intentional arm movements to bring food into the mouth.
• Thermic effect: temperature increase at the liver region after food intake.
• Body composition: body composition changes related to food intake.
• Body weight: immediate body weight increase after food intake.


Assessment Criteria

As we noted earlier, estimating energy intake requires the food's category and amount, combined with a more complex inference. We therefore didn't include energy intake in this investigation. Table 1 summarizes our evaluation results for all sensing solutions on eating behavior dimensions, limitations, and comfort.

We selected and analyzed three basic eating activities: intake gestures, chewing, and swallowing. These activities represent a temporal description of food intake and help identify intake cycles. We developed sensing prototypes for these activities and analyzed the effectiveness of these solutions for predicting food category and amount in user studies. To obtain individual performance estimates, we used a Naïve Bayes classifier preceded by linear discriminant feature extraction. Finally, to ensure the results' robustness, we deployed a five-fold cross-validation.
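As a concrete illustration of this evaluation procedure, the sketch below assembles a comparable pipeline with scikit-learn: linear discriminant analysis as the feature extraction step, a Gaussian Naïve Bayes classifier, and five-fold cross-validation. The feature matrix, labels, and library choice are illustrative assumptions, not the authors' original implementation.

```python
# Sketch of the evaluation pipeline described above (assumed scikit-learn
# implementation; X holds per-gesture feature vectors, y the activity labels).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))       # placeholder feature vectors
y = rng.integers(0, 4, size=200)     # placeholder labels (four gesture classes)

pipeline = make_pipeline(
    LinearDiscriminantAnalysis(),    # linear discriminant feature extraction
    GaussianNB(),                    # Naive Bayes classification
)

scores = cross_val_score(pipeline, X, y, cv=5)   # five-fold cross-validation
print("Mean accuracy: %.2f" % scores.mean())
```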

Intake Gestures

Most food intake requires upper body movements (arms and trunk). We distinguished these movements into coarse food and beverage preparation—such as unpacking, cooking, and plate loading—and actual food intake. The latter includes motions to fine-cut and maneuver the food piece to the mouth. In the intake phase, people use tools, such as forks and knives. We focused our recognition approach on intentional arm movements for the intake, which we refer to as "intake gestures." Because these intake gestures reflect intake types (eating or drinking) and food category (based on tools used), they provide timing and food category information.

We record intake gestures using inertial sensors at the participant's wrists and upper back. As Figure 2a shows, we derived a comfortable recording setup by integrating commercial motion sensors (www.xsens.com) in a jacket. The sensing units contain 3D acceleration, gyroscope, and compass sensors.

To evaluate how the sensors help to discriminate between different gestures, we studied four students eating foods in four different gesture categories:4

• eating lasagna with a fork and knife,
• drinking from a glass,
• eating soup with a spoon, and
• eating bread using only one hand.

The students ate and drank in random order, without particular movement instructions. During recording breaks, they performed other activities—such as reading a newspaper and making a phone call—to promote natural movement variability. In total, we recorded 1,020 intake gestures over 4.68 hours.

Using the classification procedure, we obtained 94 percent accuracy overall. Figure 2b shows the results for individual gesture categories. We used only temporal features from arm acceleration sensors. We observed that we can model intake gestures' temporal structure by computing each gesture instance's features in four sections. Without these temporal features, we achieved similar classification results, but had to use all motion sensor modalities and hidden Markov models.4
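To illustrate the temporal features described above, the sketch below splits one gesture's acceleration segment into four equal sections and computes simple per-section statistics. The particular statistics (per-axis mean and standard deviation), sampling rate, and array shapes are assumptions for illustration, not the study's exact feature set.

```python
# Sketch: four-section temporal features for one intake gesture
# (assumed feature choice: per-section mean and standard deviation per axis).
import numpy as np

def gesture_features(acc_segment: np.ndarray, n_sections: int = 4) -> np.ndarray:
    """acc_segment: (n_samples, 3) wrist acceleration of one detected gesture."""
    sections = np.array_split(acc_segment, n_sections, axis=0)
    feats = []
    for sec in sections:
        feats.extend(sec.mean(axis=0))   # mean per axis in this section
        feats.extend(sec.std(axis=0))    # variability per axis in this section
    return np.asarray(feats)             # length = n_sections * 3 axes * 2 stats

# Example: a 1.5-second gesture sampled at 100 Hz
segment = np.random.default_rng(1).normal(size=(150, 3))
print(gesture_features(segment).shape)   # (24,)
```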

Although the motion sensor jacket was a useful research prototype, we plan to replace it with less complex sensors. The classification using only acceleration shows that we can reduce the number of sensors. That said, our study wearers reported that the jacket was comfortable for sitting activities.


Figure 2. Intake gestures. (a) User wearing the motion sensor jacket during eating. (b) Classification rates for different intake gestures, including inter-person min-max values.


Chewing

One option for recording chewing strokes (the jaw opening and closing) is to monitor masseter and temporalis muscle activation using surface electromyography (EMG). However, because these muscles lie in exposed facial regions, and jaw movement is highly variable during chewing and other motions, such as speaking, this approach would require attaching sensors to the face.

To avoid compromising privacy, we found a feasible alternative. Chewing generates sound emissions that conduct through mandible, skull, and body tissue. So, we recorded chewing sounds using an ear-attached microphone. Based on an acoustic profile during chewing, we classified foods5 and analyzed different microphones and ear-device cases. Figure 3a shows one device, in which we embedded a miniature microphone in a standard headphone case. In another construction, we used an ear-pad case. With the latter setup, we studied how users perceived the ear occlusion. Smaller pads reduced ear occlusion and increased user comfort, but also reduced the signal-to-noise ratio. Users found the headphone device convenient; this was especially true of those who were used to wearing similar models with music players.

We studied the scalability of this approach to classify various foods. To do this, we asked three male students with natural dentition to eat 19 standard foods as they normally would. In several sessions, we recorded chewing using a low-occlusion ear-pad device. In this setup, the wearer could understand office-room conversation within two meters. In all, we obtained approximately 12,000 chewing strokes in five hours of data. For classification of all foods, we obtained accuracy of 80 percent. For the headphone case, this high accuracy dropped by five to 10 percent, depending on ambient noise. As features, we used spectral energy bands and cepstral and linear predictive coefficients.6 We selected these features based on robust results obtained with earlier recordings.
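The acoustic features named above could be computed per audio frame roughly as follows; the frame length, band edges, and coefficient orders are illustrative assumptions rather than the values used in the study.

```python
# Sketch: frame-level chewing-sound features (spectral band energies, real
# cepstral coefficients, and linear predictive coefficients). Parameter
# values are illustrative assumptions.
import numpy as np

def frame_features(frame: np.ndarray, fs: int = 8000) -> np.ndarray:
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)

    # Spectral energy in a few fixed bands (Hz)
    bands = [(0, 500), (500, 1000), (1000, 2000), (2000, 4000)]
    band_energy = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

    # Real cepstrum of the log spectrum (first 8 coefficients)
    cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))[:8]

    # Linear predictive coefficients, order 6 (autocorrelation method)
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:len(frame) + 6]
    lpc = np.linalg.solve(
        np.array([[r[abs(i - j)] for j in range(6)] for i in range(6)]), r[1:7]
    )

    return np.concatenate([band_energy, cepstrum, lpc])

frame = np.random.default_rng(2).normal(size=512)   # one 64-ms frame at 8 kHz
print(frame_features(frame).shape)                   # (18,)
```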

Figure 3b offers a quick overview of classifier performance for all foods. The color coding shows classifier confusions (the yellowish colors that fall outside the main diagonal) of various acoustic groups among foods. Sound patterns are primarily controlled by food texture—and, thus, lettuce is partly confounded with carrots and apples.

In this evaluation, food texture was our main selection criterion. The set includes similar textures, such as lettuce and apples, and covers a broad variety of materials and preparation styles, such as for cooked meat. Because our result demonstrates texture-based discrimination capabilities, we further deploy chewing sound recognition for nutritionally relevant food groups from the food pyramid. For example, we can group fruits and vegetables based on a similar "wet-crisp" texture and recognize this group in continuous sound data.6

Swallowing

Swallowing, which happens unconsciously throughout the day, occurs with increased frequency during food intake.7 Specifically, after we chew food and convert it into a bolus, our tongue movements initiate a reflex of throat muscles that propel the bolus through the throat into the esophagus.

Most swallowing studies analyze abnormal swallowing in laboratory settings. Because tongue and esophageal movements are challenging to monitor with on-body sensors, we focused on the swallowing reflex using sensors at the throat. To investigate different sensing modalities, we developed a set of collars.

[Figure 3b axes: predicted class vs. actual class for 19 foods: C1 potato chips, C2 apple, C3 lettuce, C4 pasta, C5 bread, C6 carrots, C7 French fries, C8 cooked chicken, C9 chocolate, C10 muffin, C11 toast, C12 cooked potato, C13 pepper, C14 maize, C15 orange, C16 waffles, C17 cornbar, C18 biscuit, C19 peanuts; color scale 0 to 1.]

Figure 3. Chewing sounds. (a) Miniature microphone integrated in headphone case. (b) Color-coded classifier confusion for the chewing of 19 foods. This classification confirms texture-related sound patterns in foods.


As Figure 4a shows, in one collar system, we monitored textile elongation to detect skin movement during swallowing. Such elongations occur mainly for male subjects, since females have a less prominent Adam's apple. Moreover, the strain-sensing collar required accurate positioning, and signals were impaired when the neck and collar moved.

In a second solution, we combined surface EMG and a stethoscope-like microphone to monitor both throat-muscle contraction in deep tissue layers and swallowing sounds (see Figure 4b). While EMG is impaired by other throat-muscle activations, the swallowing sound pattern is influenced by food viscosity. We combined both modalities to determine the amount of swallowed food.

We used these sensors with five students eating foods and drinking water as they normally would. Over several sessions, we analyzed a total of 4.85 hours of data and 868 swallows.8 We discriminated between two types of swallowing: low volume (such as 5 milliliters of water, a spoonful of yogurt, and 2 cubic centimeter pieces of bread) and large volume (15 ml water) with an overall accuracy of 73 percent. As with chewing sound classification, swallowing volume discrimination required a spectral feature set.8
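A minimal sketch of how the two throat modalities might be fused into one feature vector per detected swallow is shown below; the EMG envelope statistics and sound band energies are illustrative assumptions, not the exact features from the study.

```python
# Sketch: fusing surface-EMG and swallowing-sound features for one
# detected swallow (illustrative feature choices only).
import numpy as np

def swallow_features(emg: np.ndarray, sound: np.ndarray, fs_sound: int = 8000) -> np.ndarray:
    # EMG: rectified-envelope statistics reflect throat-muscle contraction
    envelope = np.abs(emg)
    emg_feats = [envelope.mean(), envelope.max(), len(emg)]

    # Sound: spectral band energies reflect bolus volume and viscosity
    spectrum = np.abs(np.fft.rfft(sound)) ** 2
    freqs = np.fft.rfftfreq(len(sound), d=1.0 / fs_sound)
    bands = [(0, 250), (250, 750), (750, 1500), (1500, 3000)]
    sound_feats = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

    return np.asarray(emg_feats + sound_feats)

rng = np.random.default_rng(3)
print(swallow_features(rng.normal(size=1000), rng.normal(size=4000)).shape)  # (7,)
```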

As expected, users found both collars uncomfortable for long-term monitoring. Our current work aims to replace the collar prototypes with more convenient systems, such as embedding sensors in a shirt collar.

Figure 4. Swallowing. Collar-based prototypes with (a) carbon-loaded rubber elongation sensors, and (b) integrated surface EMG and microphone.

Further On-Body Sensing Options

We analyzed whether other sensing solutions could provide eating behavior information. Our goal was to review activities and physiological responses closely related to food intake and summarize potential benefits for ADM.

Gastric Activity

Swallowed food arrives at the stomach after roughly 15 minutes. It's subsequently decomposed by stomach muscle contractions. Further digestion in the gastrointestinal tract incurs time delays in the range of hours with respect to the originating intake and thus is far less deterministic.

There are few on-body sensing options for late digestion stages. Researchers have captured stomach muscles' electric and magnetic fields using laboratory setups, such as electrogastrography (EGG).9 However, EGG hasn't reached broad clinical acceptance. A stethoscope can assess abdominal sounds from food movement in intestines. Although bowel sounds are typically loudest after fasting, researchers recently confirmed a relation to intake for laboratory settings.10 All such measurements are perturbed by heart and respiration activity, as well as by body movement.

Thermic Effect of Food Intake

The thermic effect of food intake (TEF) is a thermogenesis in response to intake above resting metabolic rate. Although TEF is the smallest component in human energy expenditure, researchers studied its relation to intake restraint and obesity.

Optimal TEF assessment requires a respiratory chamber to measure changes in resting metabolic rate before and after intake. TEF starts immediately after food reaches the stomach and peaks after roughly 60 minutes. For unrestrained eating in people of normal weight, skin temperature above the liver increased between 0.8 and 1.5 K.11 TEF depends on regularity of intake and is lower for irregular intake.12

Body Weight

Food intake is associated with an immediate gain in body weight. If weight is monitored, we can determine intake timing and food amount. Typical meals range from 50 grams for light meals to 500 grams or more for multiple-course menus. Snacks can weigh just a few grams, but still contribute an important share to daily intake, as they often include high-calorie foods or sweets.

In contrast to classic body weighing (such as weekly measurements), intake-related weight changes require continuous, day-long weighing. Load sensors in shoes would ideally serve this purpose. Compared to a scale, shoe-based weighing requires a low mechanical profile, high torsion flexibility, and low system weight. Also, the system must measure weight from foot force distribution in the (sometimes brief) moments when the user is standing still. These requirements are not easily met. Classic load cells don't fulfill the mechanical constraints.


TABLE 1. Assessing on-body sensing solutions for automatic dietary monitoring (ADM). For each sensing solution, the table lists the dimensions of eating behavior it helps estimate and the modalities and comfort for everyday use.

Intake gestures
• Timing: continuous recognition of four gesture types yielded R: 79 percent, P: 73 percent.4
• Food type: movement related to food category; recognizes four types at C: 94 percent.
• Food amount: open.
• Limit: errors occur for arbitrary arm movements to the head and unusually long gestures.
• Modalities: inertial sensors at lower arms and upper back.
• Comfort: conveniently integrated into smart clothing or accessories, such as a watch or bracelet.

Chewing
• Timing: continuous recognition for two food categories yielded R: 93 percent, P: 52 percent.6
• Food type: ear-pad device recognizes 19 foods using chewing strokes, C: 80 percent.
• Food amount: open.
• Limit: perturbed by ambient noise and low ear occlusion.
• Modalities: ear-pad microphone,5 similar to ear-attached hearing-aid devices.
• Comfort: depends on ear occlusion; convenient for headphone case.

Swallowing
• Timing: continuous recognition of four bolus types yielded R: 65 percent, P: 31 percent.8
• Food type: open.
• Food amount: bolus volume; recognizing low vs. high volume yielded C: 73 percent.8
• Limit: individual modalities impaired by head and neck movements, chewing, and speaking.
• Modalities: surface electromyography, stethoscope microphone, or similar acoustic transducer8; skin movement at the throat (as in our studies); throat impedance or capacitive sensing.
• Comfort: large sensor collars are uncomfortable; improvements expected with collar-shirt implementations.

Gastric activity
• Timing: stomach activity increases roughly 15 minutes after intake9; duration dependencies unclear.
• Food type and amount: relations unclear.
• Limit: approaches require strict laboratory settings; infeasible for nonstationary monitoring.
• Modalities: electrogastrography,9 impedance gastrography, and bowel sounds.10
• Comfort: electrodes/sensors must be tightly attached to chest or belly.

Thermic effect
• Timing: temperature increase of 0.8 to 1.5 Kelvin roughly 60 minutes after intake;11 duration dependencies unclear.
• Food type and amount: relations unclear.
• Limit: temperature depends on regularity of food intake.12 Unrestricted physical activity and ambient temperature alter this relationship.
• Modalities: temperature sensor.
• Comfort: must be attached to skin in proximity of the liver.

Body weight
• Timing: body weight increases immediately; intake duration isn't assessable.
• Food type: n/a.
• Food amount: required weight monitoring resolution is <50 grams for meals and 5 grams for snacks.
• Limit: shoe-based weight measurement requires users to stand still and is impaired by uneven floor surfaces.
• Modalities: shoe-embedded weight or force sensor array (unsolved); current in-shoe force sensors don't provide appropriate resolution.13
• Comfort: related to shoe torsion flexibility and weight.

Cardiac responses
• Timing: heart rate increases roughly 30 minutes after intake14 for up to 3 hours (laboratory).
• Food type and amount: relation to heart rate is unclear; blood pressure is influenced by salt and sugar.
• Limit: relationships altered by physical activity, fasting time, and time of day. Measurements perturbed by physical activity.
• Modalities: for heart rate, electrocardiogram chest strap or close-fitting shirt; for blood pressure, cuff-based or cuffless monitor. Research on cuffless approaches is ongoing.
• Comfort: cuff-based monitor impractical for long-term use.

Body composition
• Timing: body impedance altered roughly 30 minutes after intake in clinical settings;15 duration unknown.
• Food type and amount: relations unclear.
• Limit: measurements perturbed by body movement.
• Modalities: body impedance using electrodes.
• Comfort: hand-to-foot electrodes are potentially inconvenient.


Pressure-sensing arrays struggle to meet weight resolution requirements. Capacitive in-shoe gait measurement systems have an error of 2.7 percent,13 corresponding to 1,890 grams for a 70-kilogram person. We studied arrays of force-sensitive resistors and observed even larger errors due to signal noise and shoe torsion. At this point, a continuous wearable measurement of body weight remains unsolved.
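To make the resolution gap explicit, the short calculation below compares a relative weighing error against the meal and snack resolutions listed in Table 1; the example body weights are arbitrary.

```python
# Sketch: does a given relative weighing error meet the intake-resolution
# requirements from Table 1 (<50 g for meals, 5 g for snacks)?
def weighing_error_grams(body_weight_kg: float, relative_error: float) -> float:
    return body_weight_kg * 1000 * relative_error

for weight_kg in (60, 70, 90):                       # example body weights
    err = weighing_error_grams(weight_kg, 0.027)     # 2.7 percent error (ref. 13)
    print(f"{weight_kg} kg wearer: about {err:.0f} g error "
          f"(meal resolution met: {err < 50}, snack resolution met: {err < 5})")
```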

Cardiac Responses

After meal intake, blood is redistributed to the stomach and lower gastrointestinal tract, which increases heart rate 30 minutes after intake.14

Blood pressure is dependent on food composition, especially on salt and sugar. Classic blood pressure measurements require cuff-based solutions and are inconvenient for daily use. However, ongoing research is investigating novel cuffless approaches, such as those based on pulse arrival time. Cardiac responses depend on various aspects, including physical activity, body posture, fasting time, and time of day.

Body Composition

Food intake immediately modifies body composition. In a laboratory setting, we measure body impedance between the hand and foot; studies show that composition is altered 30 minutes after intake.15 The effect depends on both gender and food type, and further investigations are needed to study composition assessment validity. In any case, movement artifacts make the effect impractical for ADM systems.

Intake Cycle Modeling

Intake gestures, chewing, and swallowing represent a temporal description of food intake. As Figure 5a shows, we selected these solutions to construct a hierarchical recognition procedure to identify intake cycles. In our approach, an intake cycle stretches from an intake gesture (taking a bite of food) until the bite is completely swallowed. We deployed individual detectors to recognize activity events from each sensing solution.

Figure 5b illustrates two event sequences—the intake cycles for drinking and for eating. To recognize intake cycles from activity events, we implemented a probabilistic context-free grammar parser.16 The PCFG estimates the fit of event sequences to an intake grammar. We derived grammars for particular food categories, such as drinking and eating fruits. PCFGs let us model recursive event structures, such as the recursion of chewing and swallowing events for eating an apple (Figure 5b).
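To illustrate the grammar-based parsing described above, the sketch below encodes a toy intake-cycle grammar and parses one event sequence with NLTK's probabilistic parser. The rules, probabilities, and event symbols are invented for illustration and are not the grammars used in the study.

```python
# Sketch: toy probabilistic context-free grammar (PCFG) for one intake cycle,
# parsed with NLTK. Rules, probabilities, and symbols are illustrative only.
import nltk

grammar = nltk.PCFG.fromstring("""
    Cycle    -> Gesture Episodes       [1.0]
    Episodes -> Episode Episodes       [0.6]
    Episodes -> Episode                [0.4]
    Episode  -> Chews Swallow          [1.0]
    Chews    -> Chew Chews             [0.7]
    Chews    -> Chew                   [0.3]
    Gesture  -> 'hand_to_mouth'        [1.0]
    Chew     -> 'w'                    [1.0]
    Swallow  -> 's'                    [1.0]
""")

parser = nltk.ViterbiParser(grammar)

# One bite of apple: intake gesture, chewing strokes (w), swallowing events (s)
events = ["hand_to_mouth", "w", "w", "w", "s", "w", "w", "s"]
for tree in parser.parse(events):
    print(f"fit probability: {tree.prob():.4f}")
    tree.pretty_print()
```

In a full system, one such grammar per food category would be scored against the detected event stream, and the best-fitting grammar indicates the estimated category.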

Our approach provides a number of benefits for estimating eating behavior:

• The temporal fusion of individual food category estimations from intake gestures and chewing lets us recognize more diverse categories.
• The fusion compensates for individual sensing solutions' estimation errors.
• At the event level, hierarchical recognition allows simplified synchronization of sensing solutions with different sampling rates.

Figure 5. Intake cycle recognition approach. (a) Hierarchical recognition procedure, such as for food category estimation. (b) The intake event sequences for drinking and eating one bite of apple. In the event maps, a drink gesture moves a glass to the mouth and back, a hand gesture moves an apple to the mouth and back, W marks wet-crisp chewing strokes, and S marks a swallowing event.

Because ADM aims to replace manual monitoring for weight and diet coaching, we can use the manual method's eating behavior information requirements and benchmarks for ADM solutions. In our evaluations, we observed that recognizing intake activities from on-body sensors provides information on intake timing, food category, and amount. Moreover, by using on-body sensors, information is obtained continuously, independent from particular locations. Nevertheless, many current on-body sensing solutions have limitations regarding data artifacts and wearer comfort.

Although combining selected solutions in a hierarchical recognition can compensate for individual sensors' estimation errors, it refines estimations for food categories only. In comparison to self-reports—which could capture information on exact food type—this is a limitation.



Similar restrictions apply for food amount (and hence energy intake estimation). While energy intake is important, food also provides essential nutrients, and individual nutrient requirements can vary widely. Moreover, eating disorders—such as binge eating—indicate that eating is tightly coupled to momentary psychological state and emotions. Self-reports could ask specific questions to capture this day-to-day variation. However, if we consider self-reporting's practical issues and biases, even the categorical information obtained with ADM is highly beneficial. We expect that initially deployed systems will track a few food categories, such as fruits and vegetables, related to particular nutritional recommendations. Our studies showed high recognition performance for identifying these categories.

Among all selected sensing solutions, the least comfortable are the swallowing solutions. In our ongoing research, we plan to replace the current collar prototypes with more convenient systems. In addition to the diet coaching domain, we plan to deploy ADM sensing solutions in basic research to advance the understanding of eating behavior. Finally, we plan to combine on-body and ambient sensing solutions to leverage the advantages of both approaches.

Acknowledgments

We are grateful to all project participants for committing time to our various studies. The Swiss State Secretariat for Education and Research and the European Union's MyHeart (IST-2002-507816) project partially supported this work.

References

1. R.R. Wing and S. Phelan, "Long-Term Weight Loss Maintenance," American J. Clinical Nutrition, vol. 82, 2005, pp. 222S–225S.

2. J.C. Witschi, "Short-Term Dietary Recall and Recording Methods," Nutritional Epidemiology, vol. 4, Oxford Univ. Press, 1990, pp. 52–68.

3. D.A. Schoeller, "Limitations in the Assessment of Dietary Energy Intake by Self-Report," Metabolism: Clinical and Experimental, vol. 44, no. 2, 1995, pp. 18–22.

4. H. Junker et al., "Gesture Spotting with Body-Worn Inertial Sensors to Detect User Activities," Pattern Recognition, vol. 41, no. 6, 2008, pp. 2010–2024.

5. O. Amft et al., "Analysis of Chewing Sounds for Dietary Monitoring," Proc. 7th Int'l Conf. Ubiquitous Comp., LNCS 3660, Springer Verlag, 2005, pp. 56–72.

6. O. Amft and G. Tröster, "Recognition of Dietary Activity Events Using On-Body Sensors," Artificial Intelligence in Medicine, vol. 42, no. 2, 2008, pp. 121–136.

7. C.S. Lear, J.B. Flanagan, and C.F. Moorrees, "The Frequency of Deglutition in Man," Archives of Oral Biology, vol. 10, 1965, pp. 83–100.

8. O. Amft and G. Tröster, "Methods for Detection and Classification of Normal Swallowing from Muscle Activation and Sound," Proc. 1st Int'l Conf. Pervasive Comp. Technologies for Healthcare, IEEE CS Press, 2006, pp. 1–10.

9. T.L. Abell and J.R. Malagelada, "Electrogastrography," Digestive Diseases and Sciences, vol. 33, no. 8, 1988, pp. 982–992.

10. K. Yamaguchi et al., "Evaluation of Gastrointestinal Motility by Computerized Analysis of Abdominal Auscultation Findings," J. Gastroenterology and Hepatology, vol. 21, no. 3, 2006, pp. 510–514.

11. M.S. Westerterp-Plantenga, L. Wouters, and F. ten Hoor, "Deceleration in Cumulative Food Intake Curves, Changes in Body Temperature and Diet-Induced Thermogenesis," Physiology & Behavior, vol. 48, no. 6, 1990, pp. 831–836.

12. H.R. Farshchi, M.A. Taylor, and I.A. Macdonald, "Decreased Thermic Effect of Food after an Irregular Compared with a Regular Meal Pattern in Healthy Lean Women," Int'l J. Obesity and Related Metabolic Disorders, vol. 28, no. 5, 2004, pp. 653–660.

13. H. Hsiao, J. Guan, and M. Weatherly, "Accuracy and Precision of Two In-Shoe Pressure Measurement Systems," Ergonomics, vol. 45, no. 8, 2002, pp. 537–555.

14. D.R. Parker et al., "Postprandial Mesenteric Blood Flow in Humans: Relationship to Endogenous Gastrointestinal Hormone Secretion and Energy Content of Food," Euro. J. Gastroenterology and Hepatology, vol. 7, no. 5, 1995, pp. 435–440.

15. E. Gualdi-Russo and S. Toselli, "Influence of Various Factors on the Measurement of Multifrequency Bioimpedance," Homo, vol. 53, no. 1, 2002, pp. 1–16.

16. O. Amft, M. Kusserow, and G. Tröster, "Probabilistic Parsing of Dietary Activity Events," Proc. Int'l Workshop Wearable and Implantable Body Sensor Networks, vol. 13, Springer Verlag, 2007, pp. 242–247.


The Authors

Oliver Amft is an assistant professor at TU Eindhoven University of Technology, The Netherlands, and a senior research advisor at ETH Zurich's Wearable Computing Lab. His research interests include ubiquitous sensing, embedded systems, and pattern recognition for activity, behavior, and context awareness. Amft has an MSc from Chemnitz University of Technology and a PhD from ETH Zurich. He's a member of the IEEE. Contact him at amft@ieee.org.

Gerhard Tröster is a professor and head of the Wearable Computing Lab at ETH Zurich. His research interests include wearable computing for healthcare and production, smart textiles, sensor networks, and electronic packaging. Tröster has a PhD in electrical engineering from the Technical University Darmstadt. Contact him at troester@ife.ee.ethz.ch.
