
How Language Changes Perception

The Influence of (Unconscious) Language Processing

on Motion Perception

Odile Ridderinkhof

Bachelor thesis Brain & Cognition

Faculty of Social and Behavioural Sciences

University of Amsterdam

Student number: 10006834

Supervisors: Dr. S. van Gaal & J. Francken, PhD.

July 2014


Contents

Abstract
Introduction
The Influence of Language on Perception
Consciousness and the Narrative Left Hemisphere
Aim of the Current Study
Method
Results
Discussion
References


Abstract

The traditional Sapir-Whorf hypothesis states that language influences perception. It remains unclear, however, to what extent this is the case, and whether this happens at a conscious or unconscious level. Recent studies suggest that language may influence perception more strongly in the right half of the visual field, because language is mainly processed in the left hemisphere. In the current study, the influence of conscious and unconscious language on motion perception was investigated. Subjects were primed with weakly masked and strongly masked motion words before being presented with a cloud of moving dots, and had to indicate whether the dots moved upward or downward. As expected, response times were faster and accuracy was higher on trials on which the direction implied by the motion word was congruent with the direction of movement of the stimulus than on incongruent trials. Subjects were also faster when primed with strongly masked motion words than when primed with weakly masked motion words. No differences were found, however, between trials in the left and right visual field. Perhaps this is because attention plays a critical role: in the present study, subjects attended to the motion words, whereas in previous studies subjects were to ignore the motion words. The relationship between attention and consciousness with regard to perception remains to be further investigated.


Introduction

In daily life, people need to continuously process streams of information. It feels as if all human decisions are made consciously, with all available information taken into account. However, most people are not constantly aware of the chirping of a bird in the tree, the feel of their clothes on their bodies, the passing of hundreds of cars down in the street and the flickering of the light on their laptop indicating that the battery is near-empty. It seems that, luckily, many processes unfold automatically, without reaching conscious awareness. What happens to information that is processed unconsciously?

A striking example of unconscious processing of stimuli is the blindsight phenomenon, discovered by Lawrence Weiskrantz in the 1970s (Ramachandran, 2010). Blindsight can follow from brain damage to the visual cortex while the eyes remain intact. Patients suffering from blindsight claim to be unable to see. However, when asked to point to a certain stimulus or to walk through a corridor with various objects standing in the way, these patients are able to do so surprisingly well, although they think the task is entirely pointless because they report not seeing anything. This indicates that, at least to some extent, information is processed outside of conscious awareness and can unconsciously influence behaviour. This is exemplified in a study in which subjects who were unconsciously primed with words about exertion squeezed a hand grip faster and more firmly than control subjects (Aarts, Custers, & Marien, 2008), demonstrating that people rely on unconscious processes that are able to influence behaviour.

The processing of stimuli is extremely fast: less than a second is needed to perceive a combination of letters and extract the meaning of a word. This is nicely illustrated in a study in which subjects were presented with masked and unmasked words, with unmasked words being consciously readable (Dehaene, Naccache, Cohen, Le Bihan, Mangin, Poline, & Rivière, 2001). fMRI analyses revealed that masked words activated areas involved in conscious language processing, but did not activate prefrontal and parietal areas, which provides an explanation for subjects being unable to report having seen the word consciously. Even though subjects are unable to report having seen a word consciously, this word may still exert influence on perception, a well-known phenomenon widely used in priming studies. A paradigm used in recent experiments involves the presentation of motion stimuli together with motion words, of which the implied direction can be congruent or incongruent with the direction of movement of the motion stimuli (e.g., Meteyard, Bahrami, & Vigliocco, 2007). The influence of language is indicated by faster response times and higher accuracy when stimuli are presented with congruent motion words than when presented with incongruent motion words (Van Gaal & Lamme, 2011).

It seems, then, that language exerts an influence on perception. It remains unclear, however, to what extent this is the case and whether this process is fully automatic or involves conscious awareness. Firstly, in the following section, the influence of language on perception will be reviewed. As mentioned, the processing of language stimuli is fast and sometimes occurs outside of awareness (Dehaene et al., 2001), although the exact role of awareness remains unclear. Therefore, in the subsequent section, the role of awareness will be discussed. Finally, the results of the current study will be presented and discussed in the light of the present literature.

The Influence of Language on Perception

The traditional Sapir-Whorf hypothesis, also called the linguistic relativity hypothesis, states that perception is influenced by language (De Groot, 2010). Moreover, the deterministic variant of this hypothesis states that the language one speaks not only influences perception, but literally determines perception. This idea comes from the observation that the world is perceivable in numerous ways and that different languages use different ways to describe concepts. As a result, people's perception is affected by the language they speak, for they attend to what is made distinctive by their language. An illustrative example De Groot (2010) gives is that Inuit use different words for different kinds of snow: snow that is falling down, snow that has touched the ground, muddy snow, and several other types, all of which are regarded as distinct, whereas in English all of these are simply referred to as 'snow'. According to the linguistic relativity hypothesis, language thus provides a 'lens' through which the world is perceived. In this section, evidence of language influencing perception will be reviewed.

The snow illustration of linguistic relativity is but one of many examples of how concepts in different languages are mapped onto different linguistic structures. Another example, used in a study by Boutonnet, Dering, Viñas-Guasch, and Thierry (2013), is that speakers of English distinguish between a cup and a mug, whereas in Spanish both are referred to as taza. Spanish and English participants performed a visual oddball task in which they had to press a button when they saw a bowl (5% of trials), and do nothing when they saw a standard stimulus (cup, 80% of trials) or a deviant stimulus (mug, 15% of trials, or vice versa). Event-related brain potentials (ERPs) were measured. The results demonstrated that ERPs in the N1 range of native English speakers distinguished between standard and deviant stimuli, whereas ERPs of native Spanish speakers did not. For standard stimuli, the ERPs of Spanish and English speakers did not differ. The P1 component also did not differ between groups, indicating that both groups perceived the differences between the presented pictures of cups and mugs equally well. The N1 range, which follows the P1, is thought to be associated with visual processing, perhaps related to object identification. Critically, the N1 occurs before the temporal window in which lexical representations are thought to be accessed. These findings thus suggest that perception is influenced by the language one speaks at a very early stage of processing.

The influence of language on perceptual processing is not restricted to between-language differences and does not rely solely on linguistic knowledge that is already present; language can also influence low-level perceptual processing directly (Meteyard, Bahrami, & Vigliocco, 2007). Subjects performed a motion detection task while listening to verbs describing upward or downward movement, or to neutral verbs. Stimulus motion was upward, downward, or random, and thus could be congruent or incongruent with the aurally presented verbs. Perceptual sensitivity was assessed using signal detection theoretic methods. When subjects heard verbs that were incongruent with the motion signal, perceptual sensitivity decreased in comparison to when they heard verbs that were congruent with the motion signal or neutral verbs. The authors concluded that language can interfere with, and thus affect, perceptual processing. Lupyan and Ward (2013) took this finding a step further by showing that language influences not only what one sees, but whether or not something can be seen at all. In their study, subjects performed an object detection task while hearing informative or uninformative words. Masked objects were flashed on a screen. Hearing an informative word enhanced the chance of the masked object coming into awareness, whereas the object remained invisible without linguistic information or when it was presented together with irrelevant linguistic information. Relevant language thus enhances detection of otherwise invisible visual information.

The relationship between language and visual detection was studied further in an experiment in which participants performed a motion detection task while motion words were presented aurally at various time intervals (Pavan, Skujevskis, & Baggio, 2013). Motion stimuli were presented either at or above the threshold for motion detection. A double dissociation between reaction times and sensitivity was found. For observable motion, a congruency effect was found for reaction times, but not for sensitivity. For non-observable motion, a congruency effect was found for sensitivity, but not for reaction times. Differences in sensitivity were largest when stimuli were presented around 450 ms after word onset. This is explained by the electrophysiological finding that by that time words have been semantically processed and their meaning has become available (Friederici, 2002, as cited in Pavan et al., 2013).


Taken together, the results from previous studies make it plausible that language indeed influences perception, providing evidence for the linguistic relativity hypothesis. An interesting idea recently put forward is that language, because it is mainly processed in the left hemisphere, might affect perception more in the right half of the visual field (Regier & Kay, 2009). Another question that arises is to what extent the influence of language, given that it happens fast and at a very early stage of processing (Dehaene et al., 2001; Boutonnet et al., 2013), is moderated by awareness. The next section deals with these two questions.

Consciousness and the Narrative Left Hemisphere

As explained above, according to the linguistic relativity hypothesis, the language one speaks influences perception. Boutonnet et al. (2013) showed that language exerts its influence on perception at an early stage of processing. In English, subcolours of blue are indicated by terms such as light blue and dark blue, whereas in Greek these are genuinely separate colours, named ghalazio and ble, respectively. For other colours (e.g., light and dark green), however, such a distinction is not made. Speakers of Greek differ in their perception of blue from speakers of English (Thierry, Athanasopoulos, Wiggett, Dering, & Kuipers, 2009). Native speakers of Greek and English performed an oddball discrimination task while brain potentials were recorded using EEG. Subjects had to press a button when a square-shaped stimulus was presented (20% of trials), and do nothing when a circle-shaped stimulus was presented (80% of trials). All stimuli in a block were either blue or green; on 70% of trials in a block stimuli were light and on 30% of trials stimuli were dark, or vice versa, such that 10% of stimuli were circles with a deviant colour. No instructions regarding these colour differences were given. The EEG measures showed that all deviant circles elicited a visual mismatch negativity effect, occurring in the P1 range, in both Greek and English native speakers for both blue and green stimuli. Importantly, this effect was greater for Greek than for English participants for blue deviants, whereas the effect was the same for the two groups for green deviants. Visual mismatch negativity is elicited by stimuli that differ from others and is independent of attention (e.g., Berti, 2011), and is therefore regarded as automatic. The authors concluded that the larger, preattentively perceived difference for blue deviants is due to the two different terms that are used in Greek, demonstrating a very early influence of language on perceptual processing.

Mo, Xu, Kay, and Tan (2011) further explored lexical boundary effects on colour perception. They presented subjects with a total of four different shades of blue and green stimuli, all perceptually equidistant from one another, of which the lexical boundaries had been determined beforehand. On each trial, subjects were presented with two squares, one in each visual field. Within a block, both squares were of one standard colour (e.g., light green) on 80% of trials. On the other trials, one square was of a within-category deviant colour (e.g., dark green, 10% of trials) or a between-category deviant colour (e.g., light blue, 10% of trials). Subjects performed an unrelated task by pressing the space bar when the fixation mark differed from the standard (20% of trials), which occurred only on trials on which no deviant colours were presented. Electrophysiological data were recorded using EEG. The results revealed a visual mismatch negativity at 130-190 milliseconds after stimulus onset for all deviant stimuli in the left visual field and for between-category deviant stimuli in the right visual field, but, crucially, not for within-category deviant stimuli in the right visual field. As mentioned earlier, it is hypothesized that language may influence perception more in the right half of the visual field (Regier & Kay, 2009), because language is mainly processed, at least in right-handed individuals, in the left hemisphere. This idea is supported by the findings of Mo et al. (2011), which indicate that there is less of a 'surprise effect' for same-category stimuli presented in the right visual field than for stimuli presented in the left visual field.


Further evidence for the hypothesis that language influences perception more in the right visual field than in the left comes from a study by Gilbert, Regier, Kay, and Ivry (2008), in which subjects were presented with eight pictograms of cats and dogs arranged in a circle, with one pictogram being different from the others. The deviant pictogram could be either from the same category or from a different category (e.g., when all pictograms in the circle were cats, the deviant target could be either a different cat or a dog). Reaction times were faster when the target belonged to a different category, and slower when the target belonged to the same category. Moreover, these effects were stronger for stimuli presented in the right visual field than for stimuli presented in the left visual field. These results show that lateralized Whorfian effects also appear for non-colour stimuli, and that there seems to be a general influence of language on perception. This effect seems to be stronger in the right visual field than in the left visual field, presumably because language is processed in the left hemisphere.

In a recent study, Francken, Kok, Hagoort, and De Lange (2014) investigated the influence of motion words on motion perception. Participants responded faster to moving stimuli when these were preceded by congruent motion words than by incongruent motion words. This was indeed the case when stimuli were presented in the right visual field, but not in the left visual field, indicating that the influence of language on perception is larger when stimuli are processed in the language-dominant left hemisphere. fMRI analyses revealed increased neuronal activation in the left middle temporal gyrus, an area involved in both semantic processing and contextual integration, when the implied direction of the motion word was congruent with the direction of the motion stimulus.

In conclusion, language seems to influence perception, and there is evidence that this influence is larger for stimuli processed in the left hemisphere than in the right hemisphere. It remains unclear, however, to what extent this process is mediated by awareness. Results from previous studies suggest that the influence of language occurs at an early stage of processing, but awareness has not been actively manipulated.

Aim of the Current Study

The goal of the present study was to investigate the influence of conscious and unconscious motion-related language on motion perception, and whether this influence differs between the left and right visual fields. This study is based on the study by Francken and colleagues (2014) described above. Participants were presented with clouds of moving dots in either the left or right visual field and had to indicate whether the dots moved upward or downward. The visual motion stimuli were preceded by a motion word (for example, rise) flashed briefly in the center of the screen. The motion word was either weakly masked, and therefore clearly visible, or strongly masked, with the intention that it would be extremely hard to read consciously. The direction implied by the motion word was congruent or incongruent with the direction of movement of the motion stimulus, or was neutral; the words had no predictive relation with the direction of movement of the motion stimulus. Reaction times and response accuracy were measured, and perceptual sensitivity was calculated, to assess the influence of motion-related language on the perception of motion. A congruency effect is expected, resulting in shorter reaction times and higher accuracy on congruent than on incongruent trials. If there is an effect of visual field, reaction times are expected to be faster, and accuracy and perceptual sensitivity to be higher, for stimuli presented in the right visual field. The influence of awareness will be explored: awareness of the motion words may facilitate responses or, alternatively, inhibit them. There might also be an interaction between awareness and congruency, which remains to be further explored.


Method

Participants

Forty-six undergraduate students of the University of Amsterdam participated in the experiment (6 male, 40 female; age range 18-29 years). All participants were right-handed, were native Dutch speakers and had no reading problems. Compensation was 25 euros or course credit.

Materials

Stimuli were taken from the study by Francken and colleagues (2014). Participants had to respond to random-dot motion stimuli, indicating whether the dots moved upward or downward. Stimuli were preceded by a motion word or a neutral word. Motion words were previously validated (Francken et al., 2014) and consisted of five verbs describing upward motion (grow, ascend, rise, climb, go up), five verbs describing downward motion (sink, descend, drop, dive, go down) and ten neutral verbs (bet, mourn, exchange, glow, film, rest, cost, sweat, wish, relax). The implied direction of a motion word could be congruent or incongruent with the direction of dot movement. Motion words were forward masked on all trials. Motion words were backward masked on half of the trials (strongly masked) and not backward masked on the other trials (weakly masked), leading to low and high readability, respectively. On 10% of the trials, a catch trial was included on which participants had to indicate whether or not the presented verb implied motion, to check whether participants were able to consciously read the motion word. Neutral verbs were presented only on 50% of the catch trials, and these trials were not used for further analysis. The trial design, including the exact duration of the presentation of each stimulus, is illustrated in Figure 1.
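To make the trial structure concrete, the following sketch builds one block of trials with the proportions described above. It is a minimal, hypothetical illustration in MATLAB (the software used for stimulus presentation), not the original experiment script from Francken et al. (2014); all variable names are assumptions.

% A hypothetical sketch of one block of 80 trials with the proportions described
% above (not the original experiment script); variable names are assumptions.
upWords   = {'grow', 'ascend', 'rise', 'climb', 'go up'};
downWords = {'sink', 'descend', 'drop', 'dive', 'go down'};

nTrials = 80;                              % trials per experimental block
wordDir = sign(rand(nTrials, 1) - 0.5);    % +1 = upward motion word, -1 = downward motion word
wordIdx = randi(5, nTrials, 1);            % which of the five words in the relevant list is shown
dotDir  = sign(rand(nTrials, 1) - 0.5);    % +1 = upward dot motion, -1 = downward dot motion
masked  = rand(nTrials, 1) < 0.5;          % backward (strongly) masked on half of the trials
field   = sign(rand(nTrials, 1) - 0.5);    % -1 = left visual field, +1 = right visual field
isCatch = rand(nTrials, 1) < 0.10;         % 10% catch trials: judge the word instead of the dots

congruent = (wordDir == dotDir);           % word direction matches dot direction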

Stimuli were presented on an ASUS computer monitor (screen resolution 1920 x 1080, refresh rate 60 Hz, screen size 50.9 x 28.6 cm) using MATLAB (MathWorks). Both motion stimuli and motion words were presented in white (luminance 220 cd/m²) on a grey background (luminance 38 cd/m²).

Figure 1. Trial design.

Procedure

Participants signed up for two sessions of 75 minutes each, two to nine days apart. Subjects signed an informed consent form and were seated 60 centimeters from a computer screen. They received verbal instructions from the experimenter as well as written instructions on the screen. Subjects were instructed to maintain fixation on the center of the screen. Participants used their right index finger and right middle finger to press the K and L keys on a keyboard to indicate motion direction (K representing upward motion and L representing downward motion, or vice versa, counterbalanced across subjects). They were instructed to respond as quickly but also as accurately as possible. The first session consisted of training blocks in which an adaptive staircase procedure was used to set the motion coherence of the moving dots such that subjects judged the direction of movement of the stimuli correctly on approximately 75% of trials. This procedure was similar to that of the Francken et al. (2014) study and was applied to each visual field separately. This way, task difficulty and performance were similar for all subjects. The second session was used for actual data collection and consisted of one practice block and ten experimental blocks of 80 trials each. At the end of each block, participants received feedback on the percentage of correct responses.
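The exact staircase rule is not reported here; the sketch below illustrates one common possibility, a weighted up/down staircase whose step ratio converges on roughly 75% correct. It is a hypothetical MATLAB illustration, not the actual training script, and respondedCorrectly stands in for running a single training trial at the current coherence.

% A hypothetical illustration of an adaptive staircase converging on ~75% correct,
% not the actual training script. After each correct response the motion coherence
% is lowered by one step; after each error it is raised by three steps, so the
% procedure settles where p(correct) * stepDown = (1 - p(correct)) * stepUp,
% i.e. at 75% correct. respondedCorrectly is a placeholder for one training trial.
coherence = 0.50;                  % starting motion coherence (proportion of coherently moving dots)
stepDown  = 0.01;                  % decrease after a correct response
stepUp    = 3 * stepDown;          % increase after an error (3:1 ratio targets 75% correct)
for t = 1:200                      % training trials for one visual field
    correct = respondedCorrectly(coherence);
    if correct
        coherence = max(coherence - stepDown, 0.01);
    else
        coherence = min(coherence + stepUp, 1);
    end
end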

Data analysis

Reaction times and accuracy (percentage of correct responses) were measured for each subject. Based on the accuracy measures, perceptual sensitivity was calculated: a signal-detection-theoretic measure of a subject's ability to discriminate stimuli (Macmillan & Creelman, 2005). Perceptual sensitivity was calculated as d' = z(H) – z(F), where z(H) and z(F) are the z-transformed hit rate and false alarm rate, respectively. Perceptual sensitivity should be around zero for strongly masked trials, indicating that the motion word has not reached conscious awareness, and should be above zero for weakly masked trials, indicating that the meaning of the motion word has reached conscious awareness. For both reaction times and accuracy, 2 (Congruency) x 2 (Visual field) x 2 (Awareness) repeated measures ANOVAs were conducted.
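As an illustration of these two measures, the sketch below computes d' from hit and false alarm rates and sets up the 2 x 2 x 2 repeated measures ANOVA in MATLAB. This is a minimal sketch under assumptions, not the thesis's actual analysis script: hitRate, falseAlarmRate and RT are hypothetical variables, with RT assumed to be a subjects-by-8 matrix holding one mean reaction time per condition, and the Statistics and Machine Learning Toolbox is assumed to be available.

% A minimal analysis sketch (assumed, not the actual script). hitRate and
% falseAlarmRate are hypothetical per-subject vectors from the catch trials;
% RT is a hypothetical nSubjects-by-8 matrix with one mean reaction time per
% condition. Requires the Statistics and Machine Learning Toolbox.
dprime = norminv(hitRate) - norminv(falseAlarmRate);        % d' = z(H) - z(F), per subject

withinDesign = table( ...
    categorical([1 1 1 1 2 2 2 2]'), ...                    % Congruency: 1 = congruent, 2 = incongruent
    categorical([1 1 2 2 1 1 2 2]'), ...                    % Visual field: 1 = left, 2 = right
    categorical([1 2 1 2 1 2 1 2]'), ...                    % Awareness: 1 = weakly, 2 = strongly masked
    'VariableNames', {'Congruency', 'Field', 'Awareness'});
rtTable = array2table(RT, 'VariableNames', {'c1','c2','c3','c4','c5','c6','c7','c8'});
rm = fitrm(rtTable, 'c1-c8 ~ 1', 'WithinDesign', withinDesign);
ranova(rm, 'WithinModel', 'Congruency*Field*Awareness')     % main effects and interactions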

Results

Data of eight of the 46 participants was excluded from analysis because of incomplete data sets: three because of incorrect button presses and subsequent blocking of the experiment (due to an error in the MATLAB script that was fixed later on), and five because they did not show up for the second session. Data of the remaining 38 participants (2 male, 36 female, age range 18-29 years) was included in the final analysis.

Perceptual sensitivity

Perceptual sensitivity was, as expected, above zero for weakly masked trials (M = 2.27, SD = 1.11), t(37) = 12.22, p < .001, but also slightly above zero for strongly masked trials (M = 0.24, SD = 0.41), t(37) = 3.47, p = .001. This indicates that motion words were consciously readable on weakly masked trials, but were also readable on some strongly masked trials, although performance on the catch trials was higher for weakly masked words than for strongly masked words, t(37) = 10.74, p < .001.
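The comparisons against zero reported above correspond to one-sample t-tests per masking condition; a minimal MATLAB sketch (assumed, not the actual script) with dprimeStrong as a hypothetical vector of per-subject d' values for strongly masked trials:

% Hypothetical sketch: test whether group-level d' on strongly masked trials
% differs from zero (dprimeStrong is an assumed vector of per-subject d' values).
[h, p, ci, stats] = ttest(dprimeStrong);   % one-sample t-test against zero
% stats.tstat and stats.df correspond to the t(37) values reported above.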

Reaction times

A 2 (Congruency) x 2 (Visual field) x 2 (Awareness) repeated measures ANOVA was conducted for reaction times. As expected, a main effect for congruency was found, with reaction times being faster on congruent than on incongruent trials, F(1, 37) = 89.92, p < .001. Also, a main effect for awareness was found, F(1, 37) = 24.50, p < .001, indicating that participants were faster on unconscious than on conscious trials. In contrast to what Francken and colleagues (2014) found, there was no main effect for visual field (F(1, 37) = 0.77, p = .39), meaning that no differences in reaction times were found between responses to stimuli in the left and right visual fields. No significant three-way interaction was found (F(1, 37) = 0.77, p = .39), implying that the congruency and awareness effects did not differ across visual fields.

There was a significant interaction between congruency and awareness, F(1, 37) = 63.97, p < .001. The congruency effect was larger in the conscious condition (F(1, 37) = 88.04, p < .001) than in the unconscious condition (F(1, 37) = 5.48, p < .05); see Figure 2. Awareness of the motion word seems to facilitate a response when a motion word is congruent with a stimulus, and to interfere with a response when a motion word is incongruent with a stimulus.

Figure 2. Reaction times for weakly and strongly masked motion words, on congruent and incongruent trials in both the left and right visual field. Error bars represent the standard error.

Accuracy

Similarly, a 2 (Congruency) x 2 (Visual field) x 2 (Awareness) repeated measures ANOVA was conducted for response accuracy, the percentage of correct responses. As expected, a main effect for congruency was found, with accuracy being higher on congruent than on incongruent trials, F(1, 37) = 63.34, p < .001. Here, no main effect for awareness was found (F(1, 37) = 0.86, p = .36), nor was there a main effect for visual field (F(1, 37) = 2.51, p = .12). Also, no significant three-way interaction was found (F(1, 37) = 0.53, p = .47), indicating that the effects were similar across visual fields.


There was a significant interaction between congruency and awareness, F(1, 37) = 57.19, p < .001. In the conscious condition, subjects responded correctly more often on congruent trials than on incongruent trials (F(1, 37) = 44.19, p < .001) and, crucially, this was also the case for the unconscious condition, although the effect was smaller (F(1, 37) = 11.53, p < .01; see Figure 3). Here too, awareness of the motion word seems to facilitate responding correctly when the motion word is congruent with the stimulus, and to interfere with responding correctly when the motion word is incongruent with the stimulus.


Figure 3. Percentages of correct responses for weakly and strongly masked motion words, on congruent and incongruent trials in both the left and right visual field. Error bars represent the standard error.

Discussion

In the current study, the influence of conscious and unconscious language on motion perception was investigated. When the implied direction of the motion words was congruent with the direction of movement of the stimuli, reaction times were faster and accuracy was higher than when motion words were incongruent with the motion stimuli. This difference was larger when words reached conscious awareness than when they did not, even though calculations of perceptual sensitivity revealed that not all strongly masked words were fully unconscious. Surprisingly, the effects were the same for stimuli presented in the left and in the right visual field, and there were no interactions between awareness and visual field, unlike what was found in some previous studies (Francken et al., 2014).

The calculated perceptual sensitivity revealed that not all strongly masked words were processed fully outside of awareness, as indicated by subjects performing slightly above chance level on the catch trials. Nevertheless, performance was better and the effects were stronger for weakly masked trials than for strongly masked trials, possibly indicating that awareness of the motion words does play a role here. Differences between trials on which words were consciously processed and trials on which words were supposed to be unconsciously processed may, however, still be due to qualitative differences in the level of processing. Possibly, strongly masked motion words merely lead to a slower effect than weakly masked motion words, similar to strongly masked stimuli leading to slower responses in a response inhibition task (Van Gaal, Ridderinkhof, Scholte, & Lamme, 2010).

The debate about whether language influences perception differently in the left and right visual fields is still ongoing, and the evidence points in different directions. Francken et al. (2014) found differences in both reaction times and accuracy, but similar results were not found in the current study. Possibly, this is because in the present study subjects were instructed to attend to the motion words and to try to read them, as there was always a chance of a catch trial on which they had to indicate whether or not the presented word implied motion. In the study by Francken and colleagues, however, subjects did not receive any instructions regarding the motion words, and no catch trials were included in the design. Evidence suggests that when attention is paid to stimuli that are perceived unconsciously, these stimuli have a larger influence on behaviour (Naccache, Blandin, & Dehaene, 2002). This may therefore explain the absence of a visual field effect in the current study. Further research is needed to investigate to what extent attention plays a role in this process.


Another possible explanation for not finding the visual field effect could be that in the current study, motion words were presented in the center of the screen. Perhaps the neural activation elicited by a motion word would remain more local when the word is presented in one visual field only, and would consequently lead to stronger effects. However, to prevent eye movements to the side of the screen on which the motion word is presented, the subsequent motion stimulus would also have to be presented in the opposite visual field on half of the trials, requiring many more trials in the experiment.

It is hypothesized that conscious perception is tightly linked to parieto-frontal activity (Dehaene, Changeux, Naccache, Sackur, & Sergent, 2006). Perhaps this indicates more widespread activation, whereas the early, lateralized Whorfian effects seem to be more local, as they occur only in the left hemisphere and not in the right (Dehaene et al., 2001; Francken et al., 2014). Further studies using neuroimaging methods may provide answers to the question of how and where awareness interacts with language processing and perception.

Klemfuss, Prinzmetal, and Ivry (2012) emphasize that although language may influence performance on perceptual tasks, this does not necessarily mean that language really changes perception. Care must be taken when drawing conclusions from such experiments. As demonstrated in their experiment, language can influence performance both positively and negatively: linguistic labels may help when they are used to classify stimuli that are familiar or easily named, but may hinder performance when they are task-irrelevant. Further research is needed to draw clear-cut conclusions about the levels of processing at which these effects occur and to be able to speak of causal relationships.

In conclusion, language seems to influence perception, or at least to interact with it. This seems to occur automatically, probably without language having to reach conscious awareness, although there does seem to be an influence of the level of processing. It is not yet clear to what extent this is influenced by attention, leaving the relationship between attention and language processing to be further investigated.

References

Aarts, H., Custers, R., & Marien, H. (2008). Preparing and motivating behaviour outside of awareness. Science, 319, 1639.

Berti, S. (2011). The attentional blink demonstrates automatic deviance processing in vision. Neuroreport, 22(13), 664-667.

Boutonnet, B., Dering, B., Viñas-Guasch, N., & Thierry, G. (2013). Seeing objects through the language glass. Journal of Cognitive Neuroscience, 25(10), 1702-1710.

De Groot, A. M. B. (2010). Language and cognition in bilinguals and multilinguals. New York: Psychology Press.

Dehaene, S., Changeux, J.-P., Naccache, L., Sackur, J., & Sergent, C. (2006). Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends in Cognitive Sciences, 10(5), 204-211.

Dehaene, S., Naccache, L., Cohen, L., Le Bihan, D., Mangin, J. F., Poline, J. B., & Rivière, D. (2001). Cerebral mechanisms of word masking and unconscious repetition priming. Nature Neuroscience, 4(7), 752-758.

Francken, J. K., Kok, P., Hagoort, P., & De Lange, F. P. (2014). The behavioural and neural effects of language on motion perception. Manuscript submitted for publication.

Gilbert, A. L., Regier, T., Kay, P., & Ivry, R. B. (2008). Support for lateralization of the Whorf effect beyond the realm of colour discrimination. Brain and Language, 105(2), 91-98.

Klemfuss, N., Prinzmetal, W., & Ivry, R. B. (2012). How does language change perception: A cautionary note. Frontiers in Psychology, 3(78), 1-6.

Lupyan, G., & Ward, E. J. (2013). Language can boost otherwise unseen objects into perception. Proceedings of the National Academy of Sciences, 110(35), 14196-14201.

Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user's guide (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.

Meteyard, L., Bahrami, B., & Vigliocco, G. (2007). Motion detection and motion verbs: Language affects low-level visual perception. Psychological Science, 18(11), 1007-1013.

Mo, L., Xu, G., Kay, P., & Tan, L. H. (2011). Electrophysiological evidence for the left-lateralized effect of language on preattentive categorical perception of colour. Proceedings of the National Academy of Sciences, 108(34), 14026-14030.

Naccache, L., Blandin, E., & Dehaene, S. (2002). Unconscious masked priming depends on temporal attention. Psychological Science, 13, 416-424.

Pavan, A., Skujevskis, M., & Baggio, G. (2013). Motion words selectively modulate direction discrimination sensitivity for threshold motion. Frontiers in Human Neuroscience, 7(134), 1-9.

Ramachandran, V. S. (2010). The tell-tale brain. New York/London: W. W. Norton & Company.

Regier, T., & Kay, P. (2009). Language, thought, and colour: Whorf was half right. Trends in Cognitive Sciences, 13(10), 439-446.

Thierry, G., Athanasopoulos, P., Wiggett, A., Dering, B., & Kuipers, J.-R. (2009). Unconscious effects of language-specific terminology on preattentive colour perception. Proceedings of the National Academy of Sciences, 106(11), 4567-4570.

Van Gaal, S., & Lamme, V. A. F. (2011). Unconscious high-level information processing: Implications for neurobiological theories of consciousness. The Neuroscientist, 20(10), 1-15.

Van Gaal, S., Ridderinkhof, K. R., Scholte, H. S., & Lamme, V. A. F. (2010). Unconscious activation of the prefrontal no-go network. The Journal of Neuroscience, 30(11), 4143-4150.
