The Harley Effect:


Internal and external factors that facilitate positive experiences with product sounds



Elif Özcan

To love or not to love machinery sounds

Imagine the sound of a roaring engine; it draws your attention and causes you to become alert. Would you consider this a pleasant sound? Many people would agree that, in the domain of environmental sounds, engine sounds are typically not pleasant; they would not listen to them for mere pleasure. Now imagine that the roaring engine sound comes from a motorbike. Would this sound scenario be more agreeable? Perhaps: knowing the exact sound source and its function might help one appreciate what is heard. And what if the roaring sound came from a Harley Davidson? Now, how excited and pleased would you be? Some would say they love the sound of a Harley because it represents freedom and adventure (see Figure 1 for the Harley effect, as well as VideoObject 1 and VideoObject 2).

 

Sounds, being abundant in everyday activities, have great potential to evoke emotional responses. This is especially true for mechanical sounds (Bijsterveld 2008). Some people are pleased to hear a Harley Davidson motorbike accelerating, yet they dislike a buzzing toothbrush or are disappointed by the tin-like sound of a car door closing. Either way, human emotional responses towards environmental sounds are more complex than psycho-acoustic models predict. People may dislike the most harmonic sounds, such as certain types of music, or enjoy the most alarming sounds, such as fireworks or an accelerating Harley Davidson. These daily sonic events demonstrate that the acoustical disposition of sounds is insufficient for predicting how pleasant or unpleasant a sound is. They also demonstrate that the circumstances surrounding the sound events one hears influence judgments regarding those sounds. The phenomenon that typically unpleasant sounds can be experienced as positive is an inspiring topic for product sound design research.

Figure 1: The Harley effect explains that the appreciation of product sounds depends on the contextual effects a product has to offer. For example, Harley Davidson motorbikes, with their V-twin engines (left), are famous for their deep, rumbling sound. Harley Davidson, as a brand, is also associated with freedom and adventure. Honda motorbikes, on the other hand, with their four-cylinder engines (right), are famous for their smooth sound. Honda, as a brand, is associated with being reliable, controlled, and advanced in its use of technology. Appreciation of both motorbike sounds is influenced by the image created through branding efforts, the people riding them, and the environment in which they are used.

There is a growing interest in cultivating positive experiences with products because products have the potential to facilitate people’s wellbeing and to create more pleasurable circumstances for otherwise mundane tasks (Desmet 2012; Hekkert and Leder 2008; Jordan 2000; Norman 2004). Designing for the senses, emotions, visual aesthetics, ease of use, and specific meanings has been shown to be an effective positive design strategy. Within these strategies, the impact of sensory experience (e.g., auditory or visual) on other product experiences (e.g., usability) has scarcely been examined in direct relation to a product’s perceived pleasantness. For digital products and services it has been demonstrated (Hassenzahl 2004; Moshagen, Musch and Göritz 2009; Tractinsky, Katz, and Ikar 2000) that visual pleasure has a positive effect on the utilitarian function of human-product interactions (e.g., user performance or usability). Pleasure and functionality (i.e., hedonic-utilitarian consumer attitudes) are also considered as a duality in the consumer decision-making process (Batra and Ahtola 1991; Spangenberg, Voss, and Crowley 1997). Thus, not considering product sounds in this vein would be a missed opportunity.

 

This paper discusses the phenomenon that people can enjoy unpleasant product sounds and provides theoretical support for this notion. In this vein, it illustrates circumstances (internal and external factors) that lead to positive experiences with product sounds. The aim of this paper is to demonstrate that, in the field of auditory research, the acoustical disposition of sounds is insufficient to predict auditory pleasantness, which is highly susceptible to source-related context, as demonstrated by the Harley Davidson example above. Finally, the paper presents implications for future product designers. As a result, a new approach to sound design is presented that highlights how designers can manipulate product contexts, along with the traditional constructive aspects of products, in order to create better product experiences involving sound.

VideoObject 1: Harley Davidson

VideoObject 2: Honda

Unpleasant auditory experiences

Affective experiences vary from complex emotions (e.g., pride, desire, frustration, anger) to basic sensations (liking/disliking) and have two essential dimensions: valence and arousal. Valence refers to the subjective feeling of pleasantness or unpleasantness and defines the hedonic value of an experience; arousal refers to the subjective feeling of being activated or deactivated and defines the intensity of an experience (Russell 2003; Russell 1980). These basic sensations and emotions can be caused by any stimulus, object, or event. People’s daily experiences with product sounds (mixers, dental drills, shavers) indicate that product sounds are phenomenologically experienced as “annoying, obtrusive, and irritating” (Özcan and Van Egmond 2012). These affective experiences are situated on the negative side of the valence dimension, providing evidence for unpleasant experiences.

 

Negative associations can be explained by understanding people’s emotional and cognitive state of mind when they interact with product sounds (Cox 2008; Halpern, Blake and Hillenbrand 1986; Özcan and Van Egmond 2012). Andringa and Lanser (2013) suggest that people react to “noise” according to the following judgment schemes (i.e., appraisals): motivations (how important is the sound event, or how advantageous is it?), agency (who is responsible: oneself or others?), coping potential (do I have the means to cope with a situation and pursue a goal?), and future expectancy (how optimistic or pessimistic is someone towards the outcome?) (Kuppens, Champagne and Tuerlinckx 2012). Although product sounds are often informative about the current state of product use (e.g., the sound of a mixer changes according to the required speed), they can also stay constant in an environment without providing further information (e.g., a kitchen extractor fan). Such constant sounds can be annoying because they may interfere with other needs people have or goals they want to pursue (e.g., having a conversation or relaxing), or because one fails to see the advantage of a constant sound. Being unable to manipulate the quality of the product sound can create a pessimistic state of mind, which can also cause discomfort. In short, one can positively or negatively appraise a product sound in the context of its beneficial value to one’s needs and concerns.

 

Because sound is an integral property of all products (Özcan 2008), emotional responses to sounds become reflections on both the sound itself and the product. Thus, perceiving the sound as unpleasant also negatively influences how the product is perceived. Sensory experiences of auditory and visual information separately contribute to the overall product experience, and the overall pleasantness of a product benefits from or is hindered by auditory (un)pleasantness. Accordingly, manufacturers are often concerned that unpleasant sounds detract from the overall product experience and consequently aim to improve the acoustic quality of their products (Susini, McAdams, Winsberg, Perry, Viellard and Rodet 2004; Van Egmond 2008; Västfjäll, Kleiner and Gärling 2003). Most manufacturers tend to improve the acoustic quality of a product through traditional methods based on auditory perception theories, which treat sound as an independent entity.

 

The field of auditory perception focuses on the sound-evoked basic sensations that cause an affective response (i.e., sensory [dis]pleasure). Therefore, to enhance sensory pleasure, the acoustical composition of the sounds and their psycho-acoustical effect need to be understood. In basic psycho-acoustical terms, there is strong agreement in listeners’ responses towards auditory stimuli: the higher the perceived sharpness, loudness, roughness, and noisiness of a sound, the lower its sensory pleasantness (Zwicker and Fastl 1990). To be able to design and manufacture pleasant future products, acoustic and psycho-acoustic studies defining the threshold of hearing have been sources of inspiration for product development teams (Aures 1985; Berglund and Lindvall 1995; Bisping 1997; Zwicker and Fastl 1990). For example, Susini et al. (2004) investigated the acoustical parameters (noise-to-harmonic ratio, spectral center of gravity, and loudness) that underlie preferences for different brands and models of air conditioners and their associated noises. Their approach provided a basic sensory profile of air-conditioning sounds for future designs of air-conditioning units. A few studies have also shown that, by carefully engineering the constructive parts of products, designers can not only increase the perceived pleasantness of the occurring sound but also strengthen its semantic link to desirable contexts of product use (Lageat, Czellar and Laurent 2003; Van Balken 2002). For instance, Van Balken (2002) showed that the configuration of the electric and mechanical parts of a coffeemaker could be reorganized in order to create a pleasing coffeemaker sound that also expresses the notion of an “exclusive afternoon with guests.”
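To make the inverse relation between these psycho-acoustical descriptors and sensory pleasantness concrete, the sketch below combines assumed loudness, sharpness, roughness, and noisiness estimates into a toy pleasantness index. It is an illustration only, not the Zwicker and Fastl model; the reference values and weights are arbitrary assumptions, and in practice the descriptors would come from a dedicated psycho-acoustics toolbox.

```python
# A toy pleasantness index, illustrative only (not the Zwicker-Fastl model).
# Each descriptor is normalised against an assumed reference value; the
# weights are arbitrary and chosen purely for illustration.

from dataclasses import dataclass

@dataclass
class PsychoacousticProfile:
    loudness: float    # e.g. in sone
    sharpness: float   # e.g. in acum
    roughness: float   # e.g. in asper
    noisiness: float   # 0 (tonal) .. 1 (noisy)

REFERENCE = PsychoacousticProfile(loudness=4.0, sharpness=1.0, roughness=0.25, noisiness=0.5)

def toy_pleasantness(p: PsychoacousticProfile, ref: PsychoacousticProfile = REFERENCE) -> float:
    """Return a score between 0 (unpleasant) and 1 (pleasant): the higher the
    normalised loudness, sharpness, roughness, and noisiness, the lower the score."""
    penalty = (0.4 * p.loudness / ref.loudness
               + 0.3 * p.sharpness / ref.sharpness
               + 0.2 * p.roughness / ref.roughness
               + 0.1 * p.noisiness / ref.noisiness)
    return max(0.0, 1.0 - penalty / 2.0)

# A loud, sharp kitchen appliance scores lower than a quieter, smoother one.
print(toy_pleasantness(PsychoacousticProfile(8.0, 2.0, 0.5, 0.8)))   # ≈ 0.02
print(toy_pleasantness(PsychoacousticProfile(2.0, 0.8, 0.1, 0.3)))   # ≈ 0.71
```

Such a score can only reflect the acoustic disposition of the sound; as argued below, it says nothing about the product-related context in which the sound is heard.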

 

However, sensory pleasantness is not necessarily indicative of positive experiences. Sound design based solely on adjustments to acoustical parameters would be merely a partial solution in most situations. Because a product and its sound coexist, they influence each other and should be designed together. But from an engineering perspective, it is impossible to turn machinery sounds into harmonic sounds due to the limitations of the constructive parts of a product and the mechanical sounds that naturally occur. Thus, a new approach based on experience is needed for creating positive emotional responses towards product sounds rather than only ensuring sensory pleasantness. 

Contextual effects on auditory perception

Early evidence in psychology research suggests that the context and meaning of an object influence auditory perception and, specifically, the emotions elicited (Bradley and Lang 2000; Cox 2008; McDermott 2011). Bradley and Lang showed that it is the meaning of sounds, rather than their acoustical composition, that elicits emotions. Cox provided further support for Bradley and Lang’s results and demonstrated that colors (varying in hue and brightness) presented with sounds also influence people’s perception of aversive sounds. McDermott suggested that context (the associations between sounds and the events in the world that cause them) matters in the way people interact with sounds, and that people also find familiar sounds more agreeable.

 

Furthermore, psychology and acoustics studies have revealed that negative emotional states intensify (Asutay and Västfjäll 2012; Dess and Edelheit 1998; Pollock, Carter, Amir, and Marks 2006; Siegel and Stefanucci 2011; Wang, Nicol, Skoe, Sams and Kraus 2008), and that emotion-free, meaningful visual contexts moderate (Abe, Ozawa, Suzuki, and Sone 1999; Fastl 2004; Hashimoto and Hatano 2001), the perceived quality of auditory properties. For example, if a person feels anxious, the perceived loudness of a sound will be higher (Asutay and Västfjäll 2012; Siegel and Stefanucci 2011); consequently, the resulting sensory experience will also influence the elicited emotional responses. By presenting pictures unrelated to the sound event but plausible to its context of occurrence, Fastl (2004) observed that images with more pleasant content moderate the perceived pleasantness of sounds. In a study that considers the social implications of noisy environmental sounds (e.g., transportation noise), Maris (2008) states that acoustical factors (e.g., sound pressure level) and non-acoustical factors (perceived control over the noise, an individual’s noise sensitivity and personality traits, and attitudes towards the sound source) contribute equally to noise annoyance. Maris also found that if people are involved in managing the production of the noise or sound source, annoyance ratings drop significantly.

 

Overall, a person’s current emotional state, mindset, past experiences, and expectations influence how a sound is perceived and assigned a meaning. Apart from the suggestions of Cox, McDermott, and Maris, such contextual influences on auditory pleasantness judgments have not been systematically studied in the field of product (sound) design and development. Taking “context”, “product”, and “meaning” into account when seeking to understand emotional experiences with product sounds is not an established practice in the field of product sound design and should be further investigated.

The Harley Effect

The Harley Davidson sound is a good case for observing the moderating effect of external factors on auditory pleasantness (see Figure 1 and the first paragraph). The same phenomenon applies to sounds caused by products such as espresso machines, sports cars, and fireworks. A closer look at these examples provides the insight that the desire people feel for typically (i.e., psycho-acoustically) unpleasant sounds often derives from the desirable situations in which the products, and consequently their sounds, occur. With the sound of an espresso machine, people might be craving the taste of coffee or the quiet moments one has while drinking coffee; with a sports car, it might be the sense of being in control of such fast machinery; and with fireworks it could be the celebration rituals that override the natural unpleasantness of the sound. Possible reasons for such contextual effects and the consequent reversal in auditory pleasantness experiences are explained below.

 

Fundamentally, the Harley effect can be examined through two parallel processes: auditory experience and product experience. Auditory experience is about how the brain perceives the sound psycho-acoustically (Bisping 1997; Susini et al. 2004; Västfjäll, Gulbol, Kleiner and Gärling 2002), behaviorally (Edworthy, Hellier and Hards 1995; Haas and Edworthy 2006) and semantically (Bergman, Sköld, Västfjäll and Fransson 2009; Özcan 2008), and then responds to it. Product experience is about how users interact with and respond to products on the basis of the product’s utilitarian function and the user’s concerns. The resulting experience defines the functional and affective values of the product. That is, users (i) identify the product and its function through perceptual and cognitive processing in a context and (ii) provide an affective response (i.e. aesthetic, semantic, and/or emotional) (Desmet 2012). Although parallel, the auditory experience and product experience may or may not result in similar affective responses. 

 

Thus, the processing of sound and product must be intertwined for a “negative-to-positive reversal” to happen. Previous studies (Özcan 2008; Özcan and Van Egmond 2007; Özcan and Van Egmond 2009; Spence and Zampini 2006) demonstrated that perceptual and cognitive processing of sounds (encoding, recognition, remembering, and labeling) interacts with product-related visual or verbal information. On neurological grounds, Bar (2004) explains that if different objects belong to one particular concept then both semantic associations and the perception of these objects become automatically linked. Take, for example, motorbike riding as a concept. Within this concept, other concepts such as motorbikes, motorbike brands and models, accessories, bikers, roads, landscapes, engine sounds, exhaust fumes, etc. are simultaneously experienced during a ride. Consequently, re-experiencing (seeing, hearing, smelling, and talking about) one of these concepts could automatically activate the semantic knowledge and perceptual representation (images, sounds, and motor-actions) of other linked concepts. Thus, in such interactions, cross-modal processing takes place. Cross-modal interactions are often found for perceptual processes (McGurk effect), conceptual processes (dual coding), and semantic-to-perceptual processes (verbal overshadowing). Such interactions have scarcely been investigated for sensory processes incorporating auditory pleasantness (Spreckelmeyer, Kutas, Urbach, Altenmüller and Münte 2006).

Framework of auditory pleasantness experience 

Following up on the aforementioned argument for the Harley effect (i.e., the moderating effect of external factors on auditory pleasantness), Figure 2 presents the framework of auditory pleasantness experience. The framework shows that product sounds cause an affective auditory experience, and that this affective experience is transformed into a “certain” overall auditory experience under the mutual influence of internal and external factors. The main components of the framework are the affective and overall experiences of product sounds and the internal and external factors that influence these experiences. Affective experience, in general, refers to the first forms of pleasantness experiences occurring in the early emotional processing of objects (Desmet 2002; Gross and Barrett 2011; Russell 2003). Affective experience is an important perceptual process through which the physical characteristics of the object are appraised for their beneficial value (Frijda 1986). The resulting experience is sensory pleasure or displeasure. In everyday situations, objects are always presented in contextual scenes, which affect the appraisal of the sensory experiences. The same argument applies to product sounds. A sound that is perceived and affectively experienced will also be attributed a meaning and appraised in its actual context. Thus, as a result, people form an overall auditory experience that is defined by affective experiences with sound and cognitive judgments on human-product interactions. This paper focuses on this early stage of experience, which defines the affective tone of auditory experiences and the basic reactions towards sound. As emotional processes are adaptive to an object’s nature and its environmental presentation (Gross and Barrett 2011), the overall auditory experience can be manipulated by changing the input to the affective experience.

 

The main inputs that trigger the affective auditory experience are (i) internal factors, which are intrinsic to the sound quality and define the sensory pleasantness of the product sound, and (ii) external factors, which result from a product-related context and have the potential to improve the auditory pleasantness experience. The hypothesis is that the tone and strength of the auditory pleasantness experience depend on both the acoustic nature pre-defined by internal factors and the level of interaction caused by external factors. In other words, external factors have the capacity to alter the pleasantness experience of product sounds.

 

Internal factors that influence auditory pleasantness experiences are no different from those that have been studied through the acoustic and psycho-acoustical measurement of sounds. The spectral-temporal disposition of a sound plays an important role in sensory liking and psycho-acoustical responses (Zwicker and Fastl 1990). External factors, on the other hand, derive from the overall product experience and depend on the triad of user, product, and context of the user-product interaction. The user is central to the framework because human-product interactions are appraised by users according to their concerns (goals, standards, attitudes, motivations) with respect to the product (e.g., gaining mastery of the product, expressing one’s identity through ownership of the product, craving quiet time despite the functioning of the product). Of course, non-users can also experience product sounds and provide affective responses; their experience will most likely be shaped by their interactions with and concerns regarding the sound but not the product. Therefore, this framework incorporates both users and non-users who are exposed to products and affectively respond to them.
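As an illustration of how the framework’s components could be organized in an analysis tool, the sketch below models internal factors (sound-intrinsic properties) and external factors (the user-product-context triad) as simple data structures. All names, fields, and the toy weighting are assumptions made for this sketch; the framework itself does not prescribe a computational form.

```python
# Illustrative data model of the framework's components; names and fields are assumptions.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InternalFactors:
    """Sound-intrinsic properties (here normalised 0..1) that set the baseline sensory (dis)pleasure."""
    loudness: float
    sharpness: float
    roughness: float
    noisiness: float

@dataclass
class ExternalFactors:
    """Product-related context that can shift the auditory pleasantness experience."""
    user_concerns: List[str]                   # e.g. ["quiet evening", "mastery of the product"]
    is_user: bool                              # non-users respond to the sound, not the product
    product_function_rating: Optional[float]   # 0..1 for users, None for non-users
    context_affect: float                      # -1..1, affective value of the surrounding context

@dataclass
class AuditoryPleasantnessExperience:
    internal: InternalFactors
    external: ExternalFactors

    def overall_tone(self) -> str:
        """Toy illustration of the hypothesis: a desirable context and a well-functioning
        product can lift an acoustically poor baseline (the Harley effect)."""
        baseline = 1.0 - (self.internal.loudness + self.internal.sharpness
                          + self.internal.roughness + self.internal.noisiness) / 4.0
        lift = 0.3 * (self.external.product_function_rating or 0.0) + 0.3 * self.external.context_affect
        return "positive" if baseline + lift > 0.5 else "negative"
```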

Figure 2: Framework of auditory pleasantness experience.

Deconstructing the framework for auditory pleasantness experiences

The three stages of the Harley effect (i.e., the auditory pleasantness experience) are presented below. They are discussed in a gradual fashion and introduce possible external factors and opportunities for research and design (see Figure 3). Stage 1 deals with pleasantness experiences in the absence of any external interventions, which is how sensory pleasantness is predominantly measured in the current practice of sound design: the user responds to the sound but not to the product. Stage 2 introduces product use as an intervention and deals with how users’ concerns with the utilitarian function of the product affect the auditory pleasantness experience. Stage 3 broadens the scope of the intervention by introducing the affective quality of the context in which human-product interactions occur. Here, context refers to the affective associations elicited by both the product and the circumstances (e.g., situations, events, locations, other people). At each stage, the auditory pleasantness experience gradually improves from negative experiences to positive and even desirable ones.

Stage 1 – Sound Only

 

When one considers sound as a separate entity detached from its source, people attend to its tonal structure and sensory qualities, as if listening to music (Gaver 1993). The notion of musical listening in this framework refers to being exposed to a sound without any relation to its source. In this sense, musical listening is similar to acousmatic listening. Ecologically, such experiences occur when people are uninvolved in product usage but share the environment in which the sound is produced. For example, pedestrians are disturbed by a motorbike’s sound, whereas the rider appreciates it. Exposure to a product sound in the absence of its source also causes ambiguity in identifying that source (Özcan and Van Egmond 2009). For example, a high-pitched, loud, and continuous machinery sound coming from the kitchen could refer to a hand blender, a mixer, or an electric knife. Because there is no certainty in correctly identifying the sound source, the sound itself becomes “the object” to perceive and respond to (AudioObject 1). Therefore, at this stage, the perception of product sounds gives rise to judgments on their spectral-temporal properties (Björk 1985; Özcan and Van Egmond 2012; Susini et al. 2004; Von Bismarck 1974). The words often used to describe the (psycho)acoustical quality of product sounds include droning, high-pitched, sharp, rough, loud, noisy, continuous, etc. These descriptions indicate that people attribute negative associations to product sounds in the absence of the product. Thus, a lack of human-product interaction can result in displeasure on a sensory level.

Stage 2 – The Utilitarian Product

 

When one links a sound directly to the product, “everyday listening” takes over (Gaver 1993). Ecologically, everyday listening treats sound as an event with a clear function in an environment (Klatzky, Pai and Krotkov 2000; Kubovy and Van Valkenburg 2001; McAdams 1993). For example, hearing an alarm clock urges one to get up; hearing a motorbike helps one estimate its power, size, speed, and location. In everyday experiences with sound, humans instinctively give importance to the sound source (Marcell, Borella, Greene, Kerr and Rogers 2000; Özcan and Van Egmond 2012; Saygin, Dick and Bates 2008; Vanderveer 1979). Thus, the meaning underlying a sound event (e.g., fireworks are associated with celebration) indicates how pleasurable the sound is (Bradley and Lang 2000). The majority of our affective experiences with product sounds result from functional assessments of the product. Consequently, it is pleasing to interact with a product that responds to our concerns. If an alarm clock manages to wake us from a deep sleep, we will be pleased with its sharp sound; if a motorbike sounds powerful while we ride it, we will be happy with its performance. In such cases, sound is perceived as complementary to a product’s function. Therefore, incorporating the utilitarian value of a product may slightly shift the affective responses from unpleasant to acceptable. Consequently, the sound will be experienced as “functional”. The strength of the pleasantness experience depends on how pleasurable the product is in terms of its function. For example, in Figure 4, if an average coffee machine (e.g., a Philips Senseo) pours coffee, we may simply accept the occurring sound because it is part of the coffee-making experience; if an espresso machine (e.g., a Delonghi ESAM) pours coffee with foam (e.g., a cappuccino), we may enjoy the power in the sound because it denotes good quality. (Listen to the sounds in VideoObject 3 and VideoObject 4.)

Stage 3 – The Affective Context 

 

Product sounds are prone to contextual effects (e.g., location, product image/label, product-related objects) (Özcan and Van Egmond 2009). Such dependency occurs when a sound is experienced in relation to a product in a certain environment. Because product experiences result in meaningful inferences and emotional responses (Desmet and Hekkert 2007), upon hearing the sound, higher-level concepts pertaining to emotions and semantic associations (e.g., excitement, danger, adventure, being outdoors, feeling liberated) become salient (Cummings, Ceponiene, Koyama, Saygin, Townsend and Dick 2006; Orgs, Lange, Dombrowski and Heil 2006). Industrial companies aim to create affect-laden contexts with visual and/or verbal semantic input for product placement in the market: Harley Davidson uses arousing images associated with adventure in its publicity, while Apple combines function with aesthetics (Figure 5). The Harley Davidson advertisement uses bright red and orange colors in the background, representing fire and symbolizing danger; in addition, the slogan prompts potential users to earn their freedom. The Apple advertisement uses a single background color to highlight the visual aesthetics of the computer. Both advertisements visually and verbally tap into people’s needs beyond operational functionality. Such exposure to desirable experiences, or even to an established brand culture (e.g., logos, slogans), shapes expectations regarding product experiences. For example, Fitzsimons, Chartrand, and Fitzsimons (2008) have shown that participants responded to brands by behaving in line with the brand’s characteristics without any conscious awareness of the influence: participants exposed to the Apple brand showed more creative tendencies in a follow-up questionnaire than participants exposed to the IBM brand. The affective value of a product-related context may, therefore, overshadow sensory unpleasantness and shift judgments from mere “acceptability” to “pleasure”. Consequently, in the same vein, the sound will also be experienced as “semantically congruent” with the product context.

As illustrated, auditory pleasantness experience can be a floating mechanism guided by internal and external factors: sound, (non)user, product, and context. Because the framework in Figure 3 presents a new approach to understanding the experience of pleasantness with respect to product sounds, the following sections will reflect on the proposed framework and discuss its relevance to the field of industrial design.

Figure 5a (left) and Figure 5b (right): Two affect-laden product advertisements. Harley Davidson (left) uses adventurous images in their publicity, while Apple (right) combines function with aesthetics. The images used to introduce the products to potential users influence the semantic and affective reactions of the users to the product and its properties.

 

Figure 4a (left) and Figure 4b (right): The appreciation of the coffeemaker sounds may depend on the function of the product. A Philips Senseo (on the left) makes coffee of average quality and a Delonghi ESAM 4500 (on the right) makes espresso and cappuccino with foam.  

 

Figure 3: Auditory pleasantness experiences presented as a function of three stages of product-related interventions influencing affective and overall auditory experiences.

 

AudioObject 1: Listen to this product sound and observe your sensory and emotional responses to it. Do you know what product causes the sound? How pleasant is it?

 

VideoObject 4: Delonghi ESAM 4500

VideoObject 3: Philips Senseo

Theoretical reflections regarding the framework of auditory pleasantness experiences

Currently, the circumstances that help people enjoy their experiences with product sounds are largely unknown to the design practitioners and multidisciplinary auditory researchers who intend to cultivate positive experiences with them. The proposed framework is intended to initiate discussions on auditory pleasantness by presenting components of the pleasantness experience that can be manipulated in design practice. These discussions, together with future research, should yield more established design guidelines that help designers find a balance between the acoustical properties of a sound and the semantic associations of the product. The aim is to create positive daily experiences for users by improving the quality of their interactions with products that inherently produce sounds. In this way, the value of sound in product experiences will be optimized.

 

A sound can be experienced in several ways. Depending on the current situation of the listeners (users and non-users) and their concerns, the pleasantness experience of the product sound can derive from any of the stages presented above, or even from all of them. Earlier studies suggest that, when listening to a sound, people adopt different strategies that result in functionally different auditory experiences. For example, Gaver (1993), in his ecological account of environmental sounds, contrasted musical listening with everyday listening. Similarly, Chion (1994) discussed three types of listening in the context of film sound: causal, semantic, and reduced listening. In a more recent study that subsumes Gaver’s and Chion’s discussions, Tuuri, Mustonen, and Pirhonen (2007) proposed a hierarchical scheme that includes preconscious, source-oriented, context-oriented, and reduced listening. These studies provide sufficient evidence for the proposed framework (i.e., listening is not a linear activity and is prone to external factors) by suggesting that the activity of listening is directed not only toward the sound but also toward its source and environment.

 

With the three stages outlined for listening, the intent is to gradually introduce external factors so that the effect of a product and a product-related context on the affective evaluation of product sounds can be progressively observed. This staging also has a methodological rationale: each stage provides designers with different opportunities to cope with (unpleasant) sounds. Thus, different design methodologies or tools could eventually emerge from future studies.

 

In current practice, sound designers tend to jump to quick conclusions based on acousmatic (reduced or musical) listening to the sound. In acousmatic listening, one is expected to ignore the sound source and its effect on the perception of the sound, because the source is absent. Furthermore, sound designers often analyze the recorded sound on a computer in an environment that is not natural to the actual production of the sound. By doing so, they gain mastery over the auditory quality of the sound but fail to observe the human-product interactions. By analyzing sounds while observing how users interact with the product (as in causal listening), sound designers can enrich their possibilities for creating better, human-centered design solutions. Moreover, the proposed framework could be of interest to the field of psychology, as the phenomenon described here is ecologically valid and applies to understanding and improving people’s responses to other types of unpleasant stimuli in the context of interactions with meaningful objects.

Relevance for sound design 

Traditionally, engineers aim to fine-tune psycho-acoustical parameters in order to create comfortable sensory experiences (Susini et al. 2004; Västfjäll, Gulbol, Kleiner and Gärling 2002; Västfjäll, Kleiner, and Gärling 2003). However, such an approach disregards the ecological relevance of sounds and their purpose in our daily interactions with products, which are mainly based on utilitarian and hedonic functions (Batra and Ahtola 1991; Desmet and Hekkert 2007). The proposed framework tackles product sounds within product-related contexts, not as sounds detached from meaning and emotion. This topic in product sound perception explores the phenomenon that people can adapt to unpleasant sensory experiences when these are congruent with a meaning- and emotion-laden context. As a result, design teams are offered several choices for systematically designing product sounds. In this vein, how can (sound) designers make use of the external factors?

 

First of all, it is important and relevant for sound designers to thoroughly understand the occurring sound and its acoustic qualities through spectral-temporal analysis. It is necessary to define the regions in a sound event that are perceived as pleasant and unpleasant in order to pinpoint the problematic product parts that are causing unpleasant auditory experiences. This could be done via “pleasantness maps” (Laurans, Desmet and Hekkert 2009; Van Balken 2002), that is, by mapping the pleasant and unpleasant areas on, for example, a spectrogram of the sound, with frequency and amplitude shown as a function of time. This map should also be coupled to the sound events that occur when the product is in use. Psycho-acoustical analysis of the sound (i.e., how rough, sharp, loud, and noisy the sound is perceived to be) will provide further evidence on how (un)pleasant the sound is from the perspective of potential users. At this stage, designers take an analytical approach to the sound, for which simple sound editing and (psycho)acoustical analysis tools could be used. Within this analysis, the involvement of the users is limited to their responses to the sound, not to the product. For subsequent listening tests, listeners could take part in questionnaires using basic psycho-acoustical terms and/or in pairwise comparison studies on the perceptual quality of sounds. At this point, the product components causing unpleasant sounds could be replaced, or the construction of the product could be modified in order to ensure acoustically better results.
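As a rough sketch of what such a pleasantness map could look like in practice, the following code computes a spectrogram of a recorded product sound and outlines high-energy time-frequency regions as candidate problem areas. The file name, the energy-based criterion, and the 20 dB threshold are assumptions made for illustration; an actual map would be coupled to listener ratings and to the product events that produce the sound.

```python
# A minimal sketch of a "pleasantness map": flag high-energy spectrogram regions
# as candidate problem areas. Energy is only a stand-in for listener ratings here.

import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

fs, samples = wavfile.read("coffeemaker.wav")   # hypothetical recording
if samples.ndim > 1:
    samples = samples.mean(axis=1)              # mix stereo down to mono

f, t, Sxx = spectrogram(samples, fs=fs, nperseg=2048)
Sxx_db = 10 * np.log10(Sxx + 1e-12)             # power in dB

# Flag regions within 20 dB of the spectral peak as potentially problematic.
flagged = (Sxx_db > Sxx_db.max() - 20).astype(float)

fig, ax = plt.subplots()
ax.pcolormesh(t, f, Sxx_db, shading="auto")
ax.contour(t, f, flagged, levels=[0.5], colors="red")   # outline flagged regions
ax.set_xlabel("Time [s]")
ax.set_ylabel("Frequency [Hz]")
ax.set_title("Candidate unpleasant regions (energy-based, illustrative)")
plt.show()
```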

 

Secondly, the role of the sound within human-product interaction should also be clearly defined. Thus, user-oriented research (observations, interviews, and questionnaires) should take place in order to better understand how users interact with products and for what purposes they need the product. Is the sound simply providing information about the functional aspects of the product? Is there a dialog between the product and the user through sound? What happens in the absence of sound? In general, product-evoked functional experiences should facilitate the acceptability of the sound. New tools that especially facilitate observations into human-product interactions are needed. As a design strategy, the function of the product should be very clear to the user, and sound as an event should be complementary to the product function. Furthermore, the sound of a product should satisfy the users’ needs and correspond to the actions users take in the course of product use. 

 

Thirdly, designers can consider the semantic associations of the product. Product semantics can incorporate function and affective value at the same time (e.g., a cute hairdryer, an elegant coffee machine). Semantic analysis of the product could be carried out with words and images to establish what the product means to the user in terms of its functional and affective value. In addition, designers can consider how the environment relates to the affective value of the product. During human-product interactions, do other products, users, advertisements, or (micro)cultural issues influence the product experience and, consequently, the auditory experience? Similarly, is the affective value inherent to the product, or is it imposed by the objects, people, or situations that make up the affective context of the human-product interaction? Furthermore, the complementary role of the sound to the product semantics can also be established. For example, the sound of a sports car is complementary to its appearance, and both product properties feed the dynamic nature of a sports car. To position the effect of sound within the product experience, context mapping (Sleeswijk-Visser, Stappers, Van der Lugt and Sanders 2005) could be used as a design tool. With context mapping, the expectations of the users and their emotional state at the moment of human-product interaction are also considered. As a design strategy, designers could focus on creating new, exciting, and desirable product experiences. A pleasant form, attractive colors, and smart materials, all congruent with the functional and affective value of the product, will make the product experience special and desirable. Accordingly, sounds will gain semantic associations congruent with the product-related context (as in the example of the Harley Davidson).

 

The key to a successful product sound design is to simultaneously consider the physical limitations of product manufacturing on the acoustical properties of the occurring sound, and the kind of intervention (utilitarian or affective product value) that makes sense in the context of product use or interaction. Fine-tuning the acoustical properties of the product sound will create a pleasant sensory experience; however, thinking beyond the sound (that is, considering the concerns and expectations of the users with respect to the product-related context) will offer an advantage to the manufacturers in the market.  

 

However, product development teams in industry may be reluctant to re-design a product for the purpose of improved auditory quality because of the costs involved in running a parallel design process next to the main design process. The proposed approach to sound design is still valid and beneficial because design teams, without deviating too much from the main design process, can pinpoint the problematic areas of the product sound at the prototyping phase. Consequently, if listeners agree that the working prototype causes unpleasant auditory experiences, the design teams may compensate for this by offering users visually and/or functionally more pleasing experiences. In such cases, marketing strategies could also be used to provide desirable contexts for product use and interaction. For example, Nespresso has an established marketing campaign around the concept of “quick and easy” coffee making. Its strategy is to present the Nespresso concept as “an object of desire”, not only through attractive product design but also through a well-considered product service around it. In its commercials (for example, Figure 6), Nespresso uses both seductive images of freshly made, steaming coffee that represent indulgence and images of exclusive coffee-drinking activities in which an attractive and conscientious public figure such as George Clooney appears. Although, according to expert critiques on the Internet, Nespresso coffee machines do not make the most pleasant coffee machine sounds (AudioObject 2), their sound does not seem to be an issue for the appreciation of the product.

Shortcomings and future studies

The discussion above has covered only the external and internal factors that (sound) designers can influence in order to change the perception of sound within the limitations of product development. There are also other external factors, such as social and cultural influences, technological trends and developments, universal needs, and personal goals, that have a far-reaching impact on product design. For example, Bijsterveld (2008) suggests that if technological sounds are interpreted as contributing to an increase in prosperity, such sounds are considered less annoying. Furthermore, individual differences in listening styles or in auditory sensitivity have not been considered in this study, and neither have sounds with cultural values or historical meanings. All of these factors are outside the scope of the current paper, which solely challenges existing sound design practices by gradually introducing external factors familiar to design teams. However, it is certainly interesting to study further the socio-cultural influences on product sound perception, since products and their sounds are a vast component of our modern lifestyles. In future studies, the nature and type of internal and external factors should be empirically defined in order to gain more granularity in positive experiences with products.

Acknowledgments

I gratefully acknowledge the valuable comments of Marcel Cobussen and the anonymous reviewers during the review process of the paper. Many thanks to my colleagues Pieter Desmet and Paul Hekkert for the inspiring discussions during the preparation of the paper and Bryan Howell for his help with the copy-editing.

Figure 6a (left) and Figure 6b (right): Nespresso advertising incorporates images of both appetising Nespresso coffee (on the left) and potentially desirable experiences (on the right).

 

AudioObject 2: The sound of a Nespresso coffee machine

References

Abe, K., K. Ozawa, Y. Suzuki and T. Sone (1999). “The Effects of Visual Information on the Impression of Environmental Sounds.” Inter-Noise 99: 1177–1182.

 

Andringa, Tjeerd C. and J. Jolie L. Lanser (2013). “How Pleasant Sounds Promote and Annoying Sounds Impede Health: A Cognitive Approach.” International Journal of Environmental Research and Public Health 10: 1439-1461.

 

Asutay, Ercan and Daniel Västfjäll (2012). “Perception of Loudness is Influenced by Emotion.” PLoS ONE 7/6: e38660.

 

Aures, W. (1985). “Berechnungsverfahren für den sensorischen Wohlklang beliebiger Schallsignale.” Acustica 59: 130-141.

 

Bar, Moshe (2004). “Visual objects in context.” Nature Reviews: Neuroscience 5/8: 617–629.

 

Batra, Rajeev and Olli T. Ahtola (1991). “Measuring The Hedonic And Utilitarian Sources Of Consumer Attitudes.” Marketing Letters 2: 159-170.

 

Berglund, B. and T. Lindvall (1995). “Community Noise.” Archives of the Center for Sensory Research 2/1: 1-195.

 

Bergman, Penny, Anders Sköld, Daniel Västfjäll, and Niklas Fransson (2009). “Perceptual and Emotional Categorization of Sound.” Journal of Acoustical Society of America 126/6: 3156–3167.

 

Bijsterveld, Karen (2008). Mechanical Sound. Technology, Culture, and Public Problems of Noise in the Twentieth Century. Cambridge: The MIT Press.

 

Bisping, Rudolf (1997). “Car Interior Sound Quality: Experimental Analysis by Synthesis.” Acustica 83/5: 813-818.

 

Björk, E. A. (1985). “The Perceived Quality of Natural Sounds.” Acustica 57/3: 185-188.

 

Bradley, Margaret M. and Peter J. Lang (2000). “Affective Reactions to Acoustic Stimuli.” Psychophysiology 37: 204–215.

 

Chion, Michel (1994). Audio-Vision: Sound on Screen. New York: Columbia University Press.

 

Cox, Trevor J. (2008). “The Effect of Visual Stimuli on The Horribleness of Awful Sounds.” Applied Acoustics 69/8: 691–703.

 

Cummings, A., R. Ceponiene, A. Koyama, A. P. Saygin, J. Townsend and F. Dick (2006). “Auditory Semantic Networks for Words and Natural Sounds.” Brain Research 1115/1: 92-107.

 

Desmet, Pieter M. A. (2002). “Designing Emotions” (Doctoral dissertation). Delft: Delft University of Technology.

 

Desmet, Pieter M. A. (2012). “Faces of Product Pleasure: 25 Positive Emotions in Human-Product Interactions.” International Journal of Design 6/2: 1-29.

 

Desmet, Pieter and Paul Hekkert (2007). “Framework of Product Experience.” International Journal of Design 1/1: 57-66.

 

Dess, Nancy K. and David Edelheit (1998). “The Bitter with the Sweet: The taste/stress/temperament nexus.” Biological Psychology 48: 103-119.

 

Edworthy, Judy, Elizabeth Hellier and Rachael Hards (1995). “The Semantic Associations of Acoustic Parameters Commonly Used in the Design of Auditory Information and Warning Signals.” Ergonomics 38/11: 2341-2361.

 

Fastl, Hugo (2004). “Audio-Visual Interactions in Loudness Evaluation.” In S. Ueha (ed.), Proceedings of the 18th International Congress on Acoustics (pp. 1161-1166). Kyoto, Japan.

 

Fitzsimons, Gráinne M., Tanya L. Chartrand and Gavan J. Fitzsimons (2008). “Automatic Effects of Brand Exposure on Motivated Behavior: How Apple Makes You ‘Think Different’.” Journal of Consumer Research 35/1: 21-35.

 

Frijda, Nico H. (1986). The Emotions. Cambridge: Cambridge University Press.

 

Gaver, William W. (1993). “What in the World Do We Hear? An Ecological Approach to Auditory Event Perception.” Ecological Psychology 5/1: 1-29.

 

Gross, James J. and Lisa F. Barrett (2011). “Emotion Generation and Emotion Regulation: One or Two Depends on Your Point of View.” Emotion Review 3: 8-16.

 

Halpern, D. Lynn, Randolph Blake and James Hillenbrand (1986). “Psychoacoustics of a Chilling Sound.” Perception and Psychophysics 39/2: 77-80.

 

Hashimoto, T. and S. Hatano (2001). “Effects of Factors other than Sound to the Perception of Sound Quality.” In A. Alippi (ed.), Proceedings of the 17th International Congress on Acoustics. Rome, Italy. 

 

Haas, Ellen and Judy Edworthy (2006). “An Introduction to Auditory Warnings and Alarms.” In M. S. Wogalter (ed.), Handbook of Warnings (pp. 189-198). Mahwah, NJ: Lawrence Erlbaum Associates.

 

Hassenzahl, Marc (2004). “The Interplay of Beauty, Goodness and Usability in Interactive Products.” Human Computer Interaction 19: 319–349.

 

Hekkert, Paul and Helmut Leder (2008). “Product Aesthetics.” In H. N. J. Schifferstein and P. Hekkert (eds.), Product Experience (pp. 259-286). Amsterdam: Elsevier.

 

Jordan, Patrick W. (2000). Designing Pleasurable Products. London: Taylor and Francis.

 

Klatzky, Roberta L., Dinesh K. Pai and Eric Krotkov (2000). “Perception of Material from Contact Sounds.” Presence 9/4: 399-410.

 

Kubovy, Michael and David Van Valkenburg (2001). “Auditory and Visual Objects.” Cognition 80/1–2: 97-126.

 

Kuppens, Peter, Dominique Champagne and Francis Tuerlinckx (2012). “The Dynamic Interplay between Appraisal and Core Affect in Daily Life.” Frontiers in Psychology 3/380: 1-9.

 

Lageat, Thierry, Sandor Czellar and Gilles Laurent (2003). “Engineering Hedonic Attributes to Generate Perceptions of Luxury: Consumer Perception of an Everyday Sound.” Marketing Letters 14/2: 97-109.

 

Laurans, Gaël, Pieter M. A. Desmet and Paul Hekkert (2009). “The Emotion Slider: A Self-Report Device for the Continuous Measurement of Emotion.” In Proceedings of the International Conference on Affective Computing and Intelligent Interaction (pp. 408-412). Amsterdam, The Netherlands.

 

Marcell, Michael E., Diane Borella, Michael Greene, Elizabeth Kerr and Summer Rogers (2000). “Confrontation Naming of Environmental Sounds.” Journal of Clinical and Experimental Neuropsychology 22/6: 830-864.

 

Maris, Eveline (2008). “The Social Side of Noise Annoyance” (Doctoral dissertation). Leiden: University of Leiden.

 

McAdams, Stephen (1993). “Recognition of Sound Sources and Events.” In S. McAdams and E. Bigand (eds.), Thinking in Sound: The Cognitive Psychology of Human Audition (pp. 146-198). New York: Oxford University Press.

 

McDermott, Josh H. (2011). “Auditory Preferences and Aesthetics: Music, Voices, and Everyday Sounds.” In R. Dolan and T. Sharot (eds.), The Neuroscience of Preference and Choice (pp. 227-296). London: Elsevier.

 

Moshagen, Morten, Jochen Musch and Anja Göritz (2009). “A Blessing, not a Curse: Experimental Evidence for Beneficial Effects of Visual Aesthetics on Performance.” Ergonomics 52: 1311-1320.

 

Norman, Donald A. (2004). Emotional Design. New York, NY: Basic Books.

 

Orgs, Guido, Kathrin Lange, Jan-Henryk Dombrowski and Martin Heil (2006). “Conceptual Priming for Environmental Sounds and Words: An ERP Study.” Brain and Cognition 62/3: 267–272.

 

Özcan, Elif (2008). “Product Sounds: Fundamentals and Application” (Doctoral dissertation). Delft: Delft University of Technology.

 

Özcan, Elif and René van Egmond (2007). “Memory for Product Sounds: The Effect of Sound and Label Type.” Acta Psychologica 126/3: 196-215.

 

Özcan, Elif and René van Egmond (2009). “The Effect of Visual Context on the Identification of Ambiguous Environmental Sounds.” Acta Psychologica 131/2: 110-119.

 

Özcan, Elif and René van Egmond (2012). “Basic Semantics of Product Sounds.” International Journal of Design 6/2: 41-54.

 

Pollock, R. A., A. S. Carter, N. Amir and L. E. Marks (2006). “Anxiety Sensitivity and Auditory Perception of Heartbeat.” Behavior Research and Therapy 44: 1739-1756.

 

Russell, James A. (2003). “Core Affect and the Psychological Construction of Emotion.” Psychological Review 110: 145-172.

 

Russell, James A. (1980). “Circumplex Model of Affect.” Journal of Personality and Social Psychology 39: 1161-1178.

 

Saygin, Ayşe P., Frederic Dick and Elizabeth Bates (2001). “Linguistic and Nonlinguistic Auditory Processing in Aphasia.” Brain and Language 79/1: 143-145.

 

Siegel, Erika H. and Jeanine K. Stefanucci (2011). “A Little Bit Louder Now: Negative Affect Increases Perceived Loudness.” Emotion 11: 1006–1011. 

 

Spangenberg, Eric R., Kevin E. Voss and Ayn E. Crowley (1997). “Measuring the Hedonic and Utilitarian Dimensions of Attitude: A Generally Applicable Scale.” Advances in Consumer Research 24: 235-241.

 

Spence, Charles and Massimiliano Zampini (2006). “Auditory Contributions to Multisensory Product Perception.” Acta Acustica 92/6: 1009-1025.

 

Spreckelmeyer, Katja N., Marta Kutas, Thomas P. Urbach, Eckart Altenmüller and Thomas F. Münte (2006). “Combined Perception of Emotion in Pictures and Musical Sounds.” Brain Research 1070/1: 160-170.

 

Sleeswijk-Visser, Froukje, Pieter J. Stappers, Remko van der Lugt and Elizabeth B. N. Sanders (2005). “Contextmapping: Experiences from Practice.” CoDesign 1/2: 119-149.

 

Susini, Patrick, Stephen McAdams, Suzanne Winsberg, Ivan Perry, Sandrine Viellard and Xavier Rodet (2004). “Characterizing the Sound Quality of Air-Conditioning Noise.” Applied Acoustics 65/8: 763-790.

 

Tractinsky, Noam, A. S. Katz  and D. Ikar (2000). “What is Beautiful is Usable.” Interacting with Computers 13: 127–145.

 

Tuuri, Kai, Manne-Sakari Mustonen and Antti Pirhonen (2007). “Same sound – Different Meanings: A Novel Scheme for Modes of Listening.” In Proceedings of Audio Mostly (pp. 13-18). Ilmenau: Fraunhofer Institute for Digital Media Technology. 

 

Van Balken, Johan (2002). “Designing Coffeemaker Sounds. Exploring the Experience of Senseo's Sound” (Master’s thesis). Delft: Delft University of Technology.

 

Van Egmond, René (2008). “The Experience of Product Sounds.” In H. N. J. Schifferstein and P. Hekkert (eds.), Product Experience (pp. 69-89). Amsterdam: Elsevier.

 

Vanderveer, Nancy J. (1979). “Confusion Errors in Identification of Environmental Sounds.” Journal of the Acoustical Society of America 65/1: S60.

 

Västfjäll, Daniel, Mehmet Ali Gulbol, Mendel Kleiner and Tommy Gärling (2002). “Affective Evaluations of and Reactions to Exterior and Interior Vehicle Auditory Quality.” Journal of Sound and Vibration 255/3: 501-518.

 

Västfjäll, Daniel, Mendel Kleiner and Tommy Gärling (2003). “Affective Reactions to Interior Aircraft Sounds.” Acta Acustica 89/4: 693-701.

 

Von Bismarck, G. (1974). “Timbre of Steady Sounds: A Factorial Investigation of its Verbal Attributes.” Acustica 30: 146-159.

 

Wang, Jade, Trent Nicol, Erika Skoe, Mikko Sams and Nina Kraus (2008). “Emotion Modulates Early Auditory Response to Speech.” Journal of Cognitive Neuroscience 21: 2121-2128.

 

Zwicker, Eberhard, and Hugo Fastl (1990). Psychoacoustics: Facts and Models. Berlin: Springer.