from Persuasive Technology: Using Computers to Change What We Think and Do

B.J. Fogg

Preface

When I was 10 years old, I studied propaganda. Each week my fifth grade class would meet with a teaching intern from Fresno State University. He showed us how the media and politicians use techniques to change what people think and do. I learned names for the various propaganda techniques, and I could soon identify them in magazine ads and TV commercials. I felt empowered.

I thought it strange to be learning about propaganda in a rural bungalow classroom surrounded by fig orchards, but I also found the topic fascinating. I marveled at how words, images, and songs could get people to donate blood, buy new cars, or join the Army.

This was my first formal introduction to persuasion. After that, everywhere I looked I started seeing what I called “propaganda,” used for good purposes and bad.

While my interest in persuasion was growing, so was my exposure to technology, thanks to my father. In the late 1960s, we got a phone call at home. It was Dad. “I’m driving down the street now,” he said. “I’ll be home in about one minute.” He’d installed some sort of phone in his car, obviously well ahead of the curve. Later, an enormous microwave oven would find a home in our garage (the only place where the beast would fit). Soon we’d enjoy a device that could display images on our TV; we’d sit as a family and watch the eye surgeries Dad had performed. Later, before computer systems were commercially available, Dad built his own computer with parts he’d ordered, spending many evenings soldering computer chips onto a circuit board.


xxiv ■ Persuasive Technology

It seems strange now, but it was not at all unusual for my parents to take vacations to Las Vegas with their Fresno friends. The purpose wasn’t gambling; instead, they made their annual pilgrimage to the Consumer Electronics Show (CES) to experience the latest and greatest in the world of consumer technology. Sometimes they would take a few of their seven children with them. For me, this was just about the best vacation ever. At that time I did not suspect that someday I would be paid to participate in CES and similar trade shows.

My early exposure to persuasive techniques and technology clearly shaped my interests. After a long career (seven years) as an undergraduate, studying most anything that struck my fancy, I discovered an area that pulled together the interests I’d been developing since I was a child. The area was document design (now more widely known as information design). As described by Karen Schriver,[1] a leading thinker on the topic of communicating information, document design was all about making information “accessible, usable, and persuasive.”

I was enthralled by the topic and devoured everything I could find about it, from the readability of fonts, to models of text structure, to conceptual arguments about programmed instruction. With PageMaker 1.0 as my partner, I started a company named “Avatar” to provide document design services to software companies, direct marketing firms, and anyone else who needed better ways to inform and persuade an audience.

While completing my master’s degree in 1992, I created a document design curriculum and taught honors students in what I still believe was the first undergraduate course ever in information design. In the two years that I taught this course, as my students and I explored how to make documents “accessible, usable, and persuasive,” it became clear to me that my real interest was in the third aspect: persuasion.

I could see that the future of information design, specifically in creating artifacts to persuade people, lay in digital technology, in online environments, and in interactive computing products. So, with the vision of understanding how computer systems can be designed to persuade people, I began my doctoral work at Stanford University, in the process becoming a social scientist—specifically an experimental psychologist.

  1. The article by Karen Schriver that brought together so many of my interests and gave this area a name was "Document design from 1980 to 1989: Challenges that remain," Technical Communication, 36(4), 316–331 (1989). Later, Dr. Schriver published a book in this area: Dynamics in Document Design: Creating Texts for Readers (New York: Wiley, 1996).


To my surprise, after searching through the literature and asking thought leaders in related areas such as psychology, human-computer interaction (HCI), and marketing, I concluded that no one had yet paid special attention to the role of computers in persuasion. A few pioneering products existed, but there was no examination of the potentials and pitfalls of computer systems created to change people’s attitudes and behaviors.

My doctoral thesis would examine how computers could be persuasive. I titled my dissertation “Charismatic Computers.” It included experimental studies on how to make computers more likable and persuasive, and it outlined a vision for a new domain that I referred to as “captology,” an acronym based on the phrase “computers as persuasive technologies.” My vision of captology has inevitably deepened and expanded over the years as technology has evolved and I’ve learned more about the ways in which computers can influence people.

Persuasive technology is a fast-growing area of research and development. Computing systems of many types, from Web sites to productivity applications to mobile devices, are becoming increasingly focused on motivating and influencing users.

One of the assertions in this book is that in the future we’ll see more and more computing products designed for the primary purpose of persuasion. In addition, software applications—desktop or Web-based—designed mainly for other purposes (such as productivity, creativity, or collaboration) will increasingly incorporate elements of persuasion, ideally motivating users to make better use of the application and supporting them in achieving their goals.

In my view, it will become important for most people designing end-user computing products to understand how principles of motivation and influence can be designed into interactive experiences with computers. As end-user computing matures, understanding captology may become as important as understanding usability.

For the past nine years at Stanford, I’ve been investigating how interactive technologies can change people’s attitudes and behaviors. Although captology is still an emerging area of research and design, the time has come to share this work with a larger audience and to bring some order to this domain. The purpose of this book is precisely that: to lay the groundwork for better understanding current and future persuasive technologies.

My goal in writing this book is to provide insight into the current state of computers and persuasion and foresight into the likely future of persuasive technology. The book was written primarily for professionals interested in researching, analyzing, or designing persuasive technologies. But it is not a technical book, and based on my teaching experience at Stanford, I believe it has relevance for a broad range of readers. That includes technology watchers as well as executives who want to understand how they might use persuasive technology to develop new products, win new customers and markets, or strengthen brand identity and loyalty.

I hope that all readers will appreciate the importance of ethics in creating persuasive technology products. I’ve devoted a chapter of the book to this topic. In my view, the evolution of persuasive technology systems should not be left to accident or to market forces alone. The power to persuade via computing systems comes with the responsibility to use the technology for appropriate, ethical ends. This is my ultimate hope for this book: that it will contribute to the responsible design and application of persuasive technology.

Acknowledgments

Many people contributed invaluable help and support for this book. First, I want to acknowledge my debt to all of those who, over the years, have helped me design studies and conduct research relating to captology. I could not have completed this research—or this book—without your help. Thank you.

The four people on my dissertation committee at Stanford University deserve special acknowledgment. Each of them continued to support me long after my thesis was signed and delivered. Special thanks to Clifford Nass for advising me throughout my doctoral work during the mid-1990s. Without his influence and help, this book would not exist. Thanks also to Byron Reeves, for teaching me about quantitative research methods and for being an advocate of my work since my arrival at Stanford. In the area of human-computer interaction, I feel fortunate to have worked with Terry Winograd off and on over the years. I still strive to make good on the confidence Terry has placed in me and my work. In the area of psychology, I’m deeply grateful to Phil Zimbardo for seeing the potential for persuasive technology, for making time in his hectic schedule to advise me, for giving me the first opportunity to test my material in a classroom — and for being the first to encourage me to write a book on captology.

Others at Stanford have also played a key role in supporting my work and in making this book a reality. I thank John Perry for making a personal sacrifice to give my lab physical space at Stanford’s CSLI. Thanks also to Decker Walker for advocating that I teach graduate students in Stanford’s Learning, Design, and Technology Program. And many thanks to CSLI’s Michele King for helping me on innumerable tasks, big and small, making my life easier in the process.

Research done in my Stanford lab has given this book greater richness. I’m grateful to the dozens of people who have worked with me over the years. I owe a special debt of gratitude to those students who joined my lab at the beginning, showing confidence in captology and in me. These people helped to shape my thinking and contributed many unpaid and uncredited hours to the cause: Daniel Berdichevsky, John Bruck, Jared Kopf, Erik Neuenschwander, Jason Tester, and Shawn Tseng. Researchers who subsequently joined the lab have also expanded our understanding of persuasive technology. These people include John Boyd, Tami Kameda, Jonathan Marshall, Rupa Patel, Josh Solomon, Peter Westen, Peter Dodd, and Nina Kim, among others. Those who commented on early drafts of the book or contributed directly to the research you’ll find here also deserve acknowledgment. They include Meghann Evershed, George Slavich, Cathy Soohoo, Tracy Trowbridge, Ling Kong, Akshay Rangnekar, Johnnie Manzari, Katherine Woo, Kim Rice, Phil King, and Ramit Sethi.

In my industry work, I was fortunate to interact with many people who supported my research in captology. I am especially grateful to Janice Bradford (H-P), Debby Hindus (Interval Research), Bob Glass (Sun Microsystems), and Rick Kinoshita (Casio U.S. R&D). Other friends and colleagues in industry have helped to sharpen my thinking and improve the quality of this book, including Andy Cargile, Denise Caruso, Peter and Trudy Johnson-Lenz, John Lilly, Brenda Laurel, Jakob Nielsen, Youngme Moon, and Nathan Shedroff.

Thanks to various reviewers who made time to comment on different versions of my manuscript, some of whom remained anonymous. There is one person on the review team to whom I am enormously grateful: Chauncey Wilson. His critiques and suggestions were excellent; this is a much better book thanks to him.

I wish to acknowledge my editor at Morgan Kaufmann Publishers, Diane Cerra, for recognizing the importance of captology and for championing my book. Thanks, Diane!

No person contributed more to this book in its final phases than did my developmental editor, Jeannine Drew. Although the book’s content is attributed to me as the author, evidence of Jeannine’s expert assistance appears on every page. As we worked together, she offered challenging critiques of my ideas and examples, created solutions to difficult problems, and never faltered in any of her commitments. Her input and guidance have made this book better in so many ways. I look forward to the next opportunity to work with Jeannine.


The people I’ve mentioned by name above represent only a partial list of those to whom I am grateful. So many colleagues, professors, students, clients, and friends have made an impact on my thinking, my research, and this book. Any shortcomings you may find in these pages are not due to those who offered help and support to me, but are solely my responsibility.

Finally, thanks to my family for being patient about the many evenings, weekends, and holidays I’ve devoted to researching and writing this book rather than being with them. My deep appreciation to my parents, Gary and Cheryl, and to Linda, Steve, Mike, Kim, and Becky and their families. I am also extremely grateful to Denny for clearing my path so I could make progress on my many commitments, including this book.

Introduction: Persuasion in the Digital Age

Computers weren’t initially created to persuade; they were built for handling data—calculating, storing, and retrieving. But as computers have migrated from research labs onto desktops and into everyday life, they have become more persuasive by design. Today computers are taking on a variety of roles as persuaders, including roles of influence that traditionally were filled by teachers, coaches, clergy, therapists, doctors, and salespeople, among others. We have entered an era of persuasive technology, of interactive computing systems designed to change people’s attitudes and behaviors.

I define persuasive technology as any interactive computing system designed to change people’s attitudes or behaviors.

The earliest signs of persuasive technology appeared in the 1970s and 1980s, when a few computing systems were designed to promote health and increase workplace productivity. One of the earliest examples is a computer system named Body Awareness Resource Network (BARN), developed in the late 1970s. This pioneering program was designed to teach adolescents about health issues such as smoking, drugs, exercise, and more, with an ultimate focus on enhancing teens’ behaviors in these areas.[1] Gradually other interactive programs of this nature followed, most designed to address adolescent health issues or to treat psychological disorders.[2] But it wasn’t until the late 1990s—specifically, the emergence of the Internet—that more than a handful of people began creating persuasive technology.


Persuasion on the Web

The emergence of the Internet has led to a proliferation of Web sites designed to persuade or motivate people to change their attitudes and behavior. Web sites are the most common form of persuasive technology today. Consider a few examples:

  • Amazon.com doesn’t just process orders; it attempts to persuade people to purchase more products. It does so by offering suggestions based on user preferences gathered during previous visits and feedback from others who ordered the product, and by presenting compelling promotions, such as the Gold Box offers and the “Share the Love” program.

  • Iwon.com wants visitors to make it their default search engine and awards prizes to persuade them to do so.

  • Classmates.com, the leading online service for reuniting people, successfully leverages social influence principles (a topic discussed in Chapters 5 and 8) to persuade people to give up their personal information, from maiden name to birth year. In some cases, the site is able to persuade people to post personal histories and recent photographs online.

  • The New York Times online tries to persuade readers to give up their personal information, including household income, when they sign up for the free online version of the newspaper.

  • The auction site eBay has developed an online exchange system with sufficient credibility (a topic discussed in Chapters 6 and 7) that users are persuaded to make financial transactions, big and small, with strangers who have screen names like “punnkinhead” and “bodyheat2.”

Beyond the Web

Beyond the Web, persuasive technology can take on many forms, from mobile phones to “smart” toothbrushes to the computerized trailers that sit by the roadside and post the speed of passing cars in an attempt to persuade drivers to abide by the speed limit. In some cases, the technology may not even be visible to the user. With the emergence of embedded computing, the forms of persuasive technology will likely become more diverse, “invisible,” and better integrated into everyday life. The Web, which is so prominent today, will be just one of many forms of persuasive technology within another 10 years.

The uses for persuasive technology also will expand in the coming decade, extending far beyond the primary applications we see today, such as advertising, marketing, and sales. At work, persuasive technology might be used to motivate teams to set goals and meet deadlines. At home, it could encourage kids to develop better study habits. In civic life, it could persuade people to vote on election day. Wherever the need for persuasion exists, I believe that interactive technology can play a role.

Throughout this book, you’ll see plenty of examples of current and emerging persuasive technology applications. Table 1 suggests some of the domains and potential applications. Some of these examples are explored in more detail in later chapters.

Table 1 Persuasive Technology: Domains and Applications

Domain                                     Example application                  Persuades users to
Commerce                                   Amazon.com’s recommendation system   Buy more books and other products
Education, learning, and training          CodeWarriorU.com                     Engage in activities that promote learning how to write code
Safety                                     Drunk driving simulator              Avoid driving under the influence of alcohol
Environmental preservation                 Scorecard.org                        Take action against organizations that pollute
Occupational effectiveness                 “In My Steps” VR system              Treat cancer patients with more empathy
Preventive healthcare                      Quitnet.com                          Quit smoking
Fitness                                    Tectrix VR bike                      Exercise and enjoy it
Disease management                         Bronki the bronchiasaurus game       Manage asthma more effectively
Personal finance                           FinancialEngines.com                 Create and adhere to a retirement plan
Community involvement/activism             CapitolAdvantage.com                 Get ordinary citizens involved in public affairs
Personal relationships                     Classmates.com                       Reconnect with former classmates
Personal management and self-improvement   MyGoals.com                          Set goals and take the needed steps to achieve them

We are still in the early stages of persuasive technology development. The potential for using (or, unfortunately, abusing) such technology is enormous. Those who are early to understand this emerging field will be in the best position to benefit from it, personally and professionally. By understanding the ideas in this book, readers will be in a better position to

  • Recognize when Web sites and computing products are designed to influence people

  • Identify the persuasion strategies these interactive systems use

  • Understand the dynamics behind the persuasive elements in Web sites and other products

  • Identify new opportunities for influence in computing systems

  • Create interactive experiences that motivate and persuade people

  • Address the ethical issues of persuading via computing systems

  • Predict what the future holds for persuasion via computing products


Figure 1

Captology describes the area where computing technology and persuasion overlap.

[Figure omitted: a Venn diagram. One circle, labeled “Computers,” lists Web sites, mobile phones, PDAs, video games, desktop software, chat bots, smart environments, virtual reality, exercise equipment, specialized devices, and kiosks. The other circle, labeled “Persuasion,” lists behavior change, attitude change, motivation, change in worldview, and compliance. Their overlap is labeled “Captology.”]

The Emergence of “Captology”

The study of computers as persuasive technologies is relatively new. As noted in the preface, to describe this emerging area, I coined the term “captology”— an acronym based on the phrase “computers as persuasive technologies.” Briefly stated, captology focuses on the design, research, and analysis of interactive computing products created for the purpose of changing people’s attitudes or behaviors. It describes the area where technology and persuasion overlap (Figure 1).

Potential and Pitfalls

Interactivity gives computing technology a strong advantage over other persuasive media.

When I first began sharing my experimental research on computers and persuasion, I received radically different responses. Some colleagues became upset over the potential misuses of persuasive technology. Some, in peer reviews and at conferences, even declared my research immoral. Other people were excited about the potential of persuasive technology for marketing and sales: rather than using static media or costly human beings, these people glimpsed how computing technology could grow their businesses. Still others saw the potential for applying persuasive technology to promote positive social goals, such as preventing teen pregnancy or reducing world hunger.


Both positive and negative reactions to captology have merit. Perhaps more than anyone else, I’ve investigated both the potential and the pitfalls of persuasive technology. Although I don’t find captology immoral, I acknowledge that persuasive technology can be used in unethical ways in an attempt to change people’s attitudes and behaviors. For example, an online game could be used to persuade children to give up personal information.

In this book I focus primarily on the positive, ethical applications of persuasive technology. But I also highlight the pitfalls, and I explore the ethics of such technology in Chapter 9.

Advantage over Traditional Media: Interactivity

Traditional media, from bumper stickers to radio spots, from print ads to television commercials, have long been used to influence people to change their attitudes or behaviors. What’s different about computers and persuasion? The answer, in a word, is interactivity.

As a general rule, persuasion techniques are most effective when they are interactive, when persuaders adjust their influence tactics as the situation evolves. Skilled salespeople know this and adjust their pitches according to feedback from the prospect.

Persuasive technologies can adjust what they do based on user inputs, needs, and situations. An interactive program to help someone quit smoking can tailor its approach to how much the person smokes (physical addiction) and address the often-powerful psychological issues (psychological addiction) that compel the person to smoke. Over time, as the person reports progress or failures, the system can use its knowledge about the smoker’s demographic variables as well as physical and psychological addiction issues to make suggestions (such as alternatives to smoking when the urge is strong), lead the person through activities (such as interactive scenarios), or provide the right kind of encouragement to help the person quit. Traditional media cannot easily deliver such a tailored program.
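A tailoring engine like the one described above can be sketched as a handful of rules keyed to what the user reports. The thresholds, fields, and messages below are invented for illustration; they are not drawn from any real cessation program:

```python
# Illustrative sketch of a tailored quit-smoking assistant.
# All thresholds and messages here are hypothetical.

def daily_suggestion(cigarettes_per_day: int, days_smoke_free: int,
                     craving_reported: bool) -> str:
    """Pick an encouragement tailored to the smoker's reported state."""
    if craving_reported:
        # Address the immediate urge with a concrete alternative to smoking.
        return "Try a five-minute walk; most cravings fade within minutes."
    if days_smoke_free > 0:
        # Reinforce the progress the user has reported so far.
        return f"{days_smoke_free} days smoke-free. Keep it up!"
    if cigarettes_per_day >= 20:
        # Heavier smokers may respond better to tapering than quitting outright.
        return "Consider cutting back gradually and talking to a doctor."
    return "Pick a quit date this week and tell a friend about it."

print(daily_suggestion(cigarettes_per_day=25, days_smoke_free=0,
                       craving_reported=False))
```

A real system would track these inputs over weeks and adjust its rules as the person reports progress or relapse; the point is simply that the program’s output changes with the user’s situation in a way static media cannot.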

Today computer technology is being designed to apply traditional human techniques of interactive persuasion, to extend the reach of humans as interactive persuaders. This is new territory, both for computing technology and for human beings.


Advantages over Human Persuaders

When it comes to persuasion, computers not only have an advantage over traditional media. They also have six distinct advantages over human persuaders. Specifically, they can do the following:

  1. Be more persistent than human beings

  2. Offer greater anonymity

  3. Manage huge volumes of data

  4. Use many modalities to influence

  5. Scale easily

  6. Go where humans cannot go or may not be welcome

1. Computers Are Persistent

You’ve probably experienced how some software registration programs persist in asking you to register. If you don’t register at installation, from time to time the program reminds you—or nags you—to share your personal information (Figure 2). Not everyone does, of course, but the persistent reminders undoubtedly increase the rate of registration. People get tired of saying no; everyone has a moment of weakness when it’s easier to comply than to resist.

Figure 2

Eudora registration screen.


No human can be as persistent as a machine. Computers don’t get tired, discouraged, or frustrated. They don’t need to eat or sleep. They can work around the clock in active efforts to persuade, or watch and wait for the right moment to intervene. As the software registration example suggests, when it comes to persuasion, this higher level of persistence can pay off.

2. Computers Allow Anonymity

Another advantage computers have in persuasion is that they allow anonymity. The option of remaining anonymous is important in sensitive areas such as sexual behavior, substance abuse, or psychological problems.[3] It’s often easier (and less embarrassing) to get information or help anonymously, via an interactive computing program, than it is to face another human being.

Anonymity also is important when people are experimenting with new attitudes and behaviors. You may have sensed this phenomenon in anonymous chat rooms: shy people can try being bold, those with conservative values can test liberal waters, and those who normally guard their privacy can open up and speak their minds. For better and for worse, anonymity helps overcome social forces that lock people into ruts and routines.[4] At times anonymity makes it easier for people to change.

3. Computers Can Store, Access, and Manipulate Huge Volumes of Data

Another advantage: Computers can store, access, and manipulate large quantities of data, far beyond the capabilities of human beings. This gives interactive technology the potential to be more persuasive than human beings.

In some situations, the sheer quantity of information presented will change what people believe and perhaps what they do.[5] In such situations, the computer’s ability to draw on a vast storehouse of information will give it greater powers of persuasion. In other cases, the computer’s ability to find and present precisely the right fact, statistic, or reference from that volume of data can help to persuade more effectively than a human could.

The ability of computers to access and manipulate large volumes of information also enables them to make suggestions—another form of persuasion (I’ll discuss suggestion technology in Chapter 3). Using collaborative filtering or Bayesian networks—automated methods for making inferences—computers can predict what a user is likely to buy or do and make recommendations based on those predictions. Sometimes I go to Amazon to buy a single CD and end up buying a few different titles because the site made excellent recommendations. (To me, getting targeted recommendations feels like a service, not a hard sell, but others may find this “service” intrusive.)
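As a rough sketch of the collaborative-filtering idea mentioned above: recommend items a user doesn’t own that co-occur with items she does own in other users’ histories. The purchase data is made up, and real recommender systems (Amazon’s included) are far more sophisticated than this:

```python
from collections import Counter

# Hypothetical purchase histories: user -> set of items bought.
purchases = {
    "ann":  {"cd_jazz", "cd_blues", "book_music"},
    "ben":  {"cd_jazz", "cd_blues"},
    "cara": {"cd_jazz", "book_music"},
}

def recommend(user: str, histories: dict) -> list:
    """Rank items the user lacks by how often other users bought
    them together with items the user already has."""
    owned = histories[user]
    scores = Counter()
    for other, items in histories.items():
        if other == user:
            continue
        overlap = len(owned & items)  # shared tastes with this other user
        if overlap:
            for item in items - owned:  # items the target user doesn't own yet
                scores[item] += overlap
    return [item for item, _ in scores.most_common()]

print(recommend("ben", purchases))  # → ['book_music']
```

Because both “ann” and “cara” bought jazz along with the music book, the system infers that “ben,” a jazz buyer, may want the book too — the same co-occurrence logic, at vastly larger scale, that drives commercial recommendation engines.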

4. Computers Can Use Many Modalities

Often people are influenced not by information itself but by how it’s presented—the modality. Human beings can convey information in many modes, but we cannot match the variety of modes available to a computing system.

To persuade, computers can present data and graphics, rich audio and video, animation, simulation, or hyperlinked content.[6] The ability to use various modalities enables technology to match people’s preferences for visual, audio, or textual experiences. Technology can also create a synergistic effect by combining modes, such as audio, video, and data, during an interaction to produce the optimum persuasive impact.

One example of combining computing modalities emerged in the wake of the terrorist attacks of September 11, 2001. In the days that followed, as the United States and other countries debated how to respond, Alternet.org, whose mission is to “engage our community of readers in problem solving, community action and awareness of current events in the United States and abroad,” created a Web-based experience to affect this response.[7] In seeking to persuade, the creators drew on at least three modalities available to computers. First, most people learned about the site through a text email from a friend, which included a link to the Web page.[8] Once at the site, the user saw an animation unfold, combining moving text, images, and a soundtrack. At the conclusion of this minute-long animation, the text read, “Urge President Bush to exercise sober restraint in responding. Click here.” When users clicked the button, they were presented with a specific call to action and a template email they could modify and send to the White House.

Computing technology is the only medium that could combine this range of modalities into a seamless experience, starting with a friend’s email, leading to an emotionally charged animation, and ending with the means to take immediate action on the issue. That’s the power of leveraging modalities to persuade.

The computer-based intervention “Alcohol 101” provides another example of how persuasive technology can leverage multiple modalities.[9] As first-year college students use this product to explore the negative consequences of excessive drinking at college parties, they find many types of experiences: interactive stories, TV-like video footage, simulations that calculate blood alcohol content, an interactive text-based game, and more. It’s a rich interactive product, with elements likely both to appeal to the wide range of people who use it and to affect their attitudes and behavior.

5. Computer Software Can Scale

The next advantage technology has over human persuaders is the ability to scale—to grow quickly when demand increases. If a human persuader is effective, it’s difficult to scale the experience so that it reaches millions of people around the world quickly. How can you replicate a top sales rep, an influential personal trainer, or a charismatic religious figure? You can increase the person’s scope of influence through print, audio, or video communications, but the original experience may get lost along the way, particularly if the original experience was interactive.

By contrast, when it comes to software-based experiences—especially those delivered over the Internet—scaling is relatively easy. You can replicate and distribute persuasive technology experiences that work just like the original.

6. Computers Can Be Ubiquitous

The final advantage that technology has over human persuaders is ubiquity— the ability to be almost everywhere. With the growth of embedded computers, computing applications are becoming commonplace in locations where human persuaders would not be welcomed, such as the bathroom or bedroom, or where humans cannot go (inside clothing, embedded in an automotive system, or implanted in a toothbrush).

When interactive computing systems are embedded in everyday objects and environments, they can intervene at precisely the right time and place, giving them greater persuasive power. (Chapters 3 and 8 address the persuasive impact of intervening at the right time and place.) Rather than having parents nag their kids to brush their teeth, a smart toothbrush could help motivate kids to do the job by reminding them at the appropriate time and place. Likewise, an embedded car system can be more effective than a classroom discussion in promoting safe driving, by intervening at just the right moments, such as after a reckless driver has barely avoided an accident. The system might sense that the driver slammed on the brakes, failed to use a turn signal, or was otherwise negligent, and communicate with the driver via an audio signal, verbal message, or other means.
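
The kind of event-triggered intervention described here can be sketched as a simple rule set. Everything in this sketch—the event fields, the messages, the function name—is invented for illustration; a real embedded system would draw on far richer sensor data. The point it demonstrates is timing: the message is chosen immediately after a risky maneuver, when the driver is most receptive.

```python
from dataclasses import dataclass

@dataclass
class DrivingEvent:
    """Hypothetical sensor summary for one driving incident."""
    hard_brake: bool = False
    turn_without_signal: bool = False
    near_collision: bool = False

def choose_intervention(event):
    """Pick a message for a hypothetical in-car persuasion system.

    Rules are checked in order of severity; returns None when no
    intervention is warranted, so the system stays silent by default.
    """
    if event.near_collision:
        return "audio-alert: That was a close call. Consider slowing down."
    if event.hard_brake:
        return "chime: Hard braking detected. Leave more following distance."
    if event.turn_without_signal:
        return "chime: Remember to signal before turning."
    return None
```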

With the rise of ubiquitous computing, we’ll see a growing number of technologies that attempt to motivate and influence. In the coming years we are likely to see computers playing new persuasive roles in promoting health, safety, and eco-friendly behavior, in addition to selling products and services (the most frequent application of persuasive technology today).

How to Read This Book

In the following chapters, I will provide frameworks and principles for understanding persuasive technology. Along the way, I’ll discuss studies I’ve conducted at Stanford, as well as share many examples of computing products— from Web sites to mobile systems—designed to change what people think and do. I’ll also outline possibilities for new types of persuasive technologies.

The plan of the book is straightforward: The first five chapters lay the groundwork for understanding captology. Subsequent chapters address computer credibility, Web credibility, mobile and networked persuasion, and ethics. The last chapter provides a glimpse into the future of persuasive technology.

Throughout this book, my goal is to provide understanding and insight and some general “how to” guidelines. Whether you are a designer, researcher, or user of persuasive technology, I’m confident you can apply the insights offered here to your own work and life. By providing a framework for understanding persuasive technology and for designing responsible, ethical applications, it’s my hope that I can help others to leverage the power of technology to improve the lives of individuals and communities.

The field of captology is evolving. With that in mind, I have established a Web site, www.persuasivetech.info, where readers can go to find the latest information about this emerging area. At the site I’ll also post errata sheets for this book, as well as comments, corrections, and suggestions from readers. I would welcome your feedback.


Notes and References

For updates on the topics presented in this chapter, visit www.persuasivetech.info.

  1. K. Bosworth, D. H. Gustafson, R. P. Hawkins, B. Chewning, and P. M. Day, BARNY: A computer-based health information system for adolescents, Journal of Early Adolescence, 1(3): 315–321 (1981).

  2. For example, see G. Lawrence, Using computers for the treatment of psychological disorders, Computers in Human Behavior, 2(1): 43–62 (1986).

See the following for more on early examples of persuasive technology:

  • a. S. J. Schneider, Trial of an on-line behavioral smoking cessation program, Computers in Human Behavior, 2: 277–286 (1986).

  • b. M. L. Tombari, S. J. Fitzpatrick, and W. Childress, Using computers as contingency managers in self-monitoring interventions: A case study, Computers in Human Behavior, 1: 75–82 (1985).

  • c. J. Woodward, D. Carnine, and L. Davis, Health ways: A computer simulation for problem solving in personal health management, Special issue: Technological advances in community health, Family & Community Health, 9(2): 60–63 (1986).

  • d. C. Muehlenhard, L. Baldwin, W. Bourg, and A. Piper, Helping women “break the ice”: A computer program to help shy women start and maintain conversations with men, Journal of Computer-Based Instruction, 15(1): 7–13 (1988).

  • e. S. W. Demaree, Interactive technology: The greatest sales tool ever invented? Magazine of Bank Administration, 63(1): 16 (1987).

  3. For an example of how anonymity is helpful when dealing with sexual issues, see Y. M. Binik, C. F. Westbury, and D. Servan-Schreiber, Case histories and shorter communications: Interaction with a “sex-expert” system enhances attitudes towards computerized sex therapy, Behaviour Research and Therapy, 27: 303–306 (1989).

  4. For one of the early studies showing that people who are anonymous break from their socially induced behaviors, see

P. G. Zimbardo, The human choice: Individuation, reason, and order versus deindividuation, impulse, and chaos, Nebraska Symposium on Motivation, 17: 237–302 (1969).

For recent research on the effects of anonymity in computer environments, see

T. Postmes, R. Spears, K. Sakhel, and D. De Groot, Social influence in computer-mediated communication: The effects of anonymity on group behavior, Personality and Social Psychology Bulletin, 27: 1243–1254 (2001).

Also, see the ongoing work of Dr. Martin Lea, a psychologist at the University of Manchester: http://www.psy.man.ac.uk/staff/mlea/index.htm.


  5. When people aren’t deeply involved in an issue or are not able to think deeply (in other words, when they are using peripheral processing rather than central processing), they rely more heavily on the number of arguments in favor of an idea rather than the quality of those arguments. This is one of the basic assertions of the Elaboration Likelihood Model. Richard Petty and John Cacioppo have published widely in this area. For example, see

    • R. E. Petty and J. T. Cacioppo, The elaboration likelihood model of persuasion, in L. Berkowitz (ed.), Advances in Experimental Social Psychology (San Diego, CA: Academic Press, 1986), vol. 19, pp. 123–205.
  6. For a discussion of how graphics can be persuasive, see W. King, M. Dent, and E. Miles, The persuasive effect of graphics in computer-mediated communication, Computers in Human Behavior, 7(4): 269–279 (1991).

  7. The online experience related to the events of September 11, 2001, was created by freerangegraphics.com.

  8. You can find the online experience related to the events of September 11, 2001, at http://www.alternet.org/break_cycle.html.

  9. For information about the Alcohol 101 program, see www.alcohol101.org.

Chapter 1

Overview of Captology

Defining Persuasion

Although philosophers and scholars have been examining persuasion for at least 2,000 years, not everyone agrees on what the term really means.[1] For purposes of captology, I define persuasion as an attempt to change attitudes or behaviors or both (without using coercion or deception). This is a broad definition, and one on which many persuasion professionals, such as academic researchers, marketers, and clinical psychologists, would agree. It also fits with how the word is used in everyday life.

It’s important to note the difference between persuasion and coercion, terms that are sometimes confused. Coercion implies force; while it may change behaviors, it is not the same as persuasion, which implies voluntary change—in behavior, attitude, or both.[2]

Similarly, persuasion and deception may be confused. For instance, when I ask my students to find examples of persuasion on the Web, invariably some of them come to class with screen shots of Internet banner ads that report false emergencies (“Your system resources are low. Click here!”) or that misinform users (“Pornography is downloading to your computer. Click here to stop.”). While such ads might change what people think and do, they do so through deception, not persuasion. Computer-based coercion and deception are topics in their own right,[3] but they are not covered under the umbrella of captology because they do not depend on persuasion.[4]


Focus on the Human-Computer Relationship

In the premier issue of the academic journal Interacting with Computers, an editorial posed an important question: Do we interact with computers or do we interact through them?[5] While a good rhetorician could argue either side of this question, it seems clear that people interact both with and through computers, depending on the situation.

Captology—the study of computers as persuasive technology—focuses on human-computer interaction (HCI), not on computer-mediated communication (CMC). Specifically, captology investigates how people are motivated or persuaded when interacting with computing products rather than through them. CMC is a separate area of research and design, with interesting intellectual questions to answer and big dollars at stake.[6] But it falls outside the realm of captology.

Under the CMC model, the computer is a channel that allows humans to interact with each other. For example, people in different locations may use computer tools, such as instant messaging and electronic whiteboards, to collaborate with one another. In this scenario, the computer facilitates communication; it does not persuade.

By contrast, in a human-computer interaction, the computing product is a participant in the interaction and can be a source of persuasion. The computer can proactively seek to motivate and influence users, drawing on strategies and routines programmed into it. It can encourage, provide incentives, and negotiate, to name a few strategies. In later chapters you’ll find examples of technology products that use such proactive persuasion techniques.

Persuasion Is Based on Intentions, Not Outcomes

At the start of this chapter, I defined persuasion as an attempt to change attitudes or behaviors or both. This definition implies that true persuasion—whether brought about by humans or computers—requires intentionality. Captology focuses on the planned persuasive effects of computer technologies. This point about intentionality may seem subtle, but it is not trivial. Intentionality is what distinguishes between a planned effect and a side effect of a technology.[7]


If you examine the history of computing technologies, you find that many high-tech products have changed the way people think, feel, and act. But most of these changes were not planned persuasive effects of the technology; they were side effects. Once people started using email, most probably changed how they used “snail mail”: they bought fewer stamps and went to the post office less often. Similarly, when video games came onto the market, kids started watching less television and playing outside less often.[8]

Captology does not include such unintended outcomes; it focuses on the attitude and behavior changes intended by the designers of interactive technology products. These planned effects can range widely, from persuading people to buy things online, to motivating people to take stretch breaks after extended periods of desk work, to convincing people that bioterrorism is a serious threat.


One other point about intentions: Captology focuses on endogenous intent, that is, the persuasive intent that is designed into a computing product. A product could also acquire exogenous persuasive intent when users or another source adopt it for a persuasive goal the designers hadn’t planned. For example, the Palm computer is not a persuasive product by design, but a student might buy it to motivate herself to do homework more regularly. The Sony CD Discman wasn’t designed to be persuasive, but a friend of mine bought one because she thought that the ability to listen to music during her workouts would motivate her to run more often. Captology does not focus on such exogenous intent but only on the endogenous persuasive intent built into a product.

Levels of Persuasion: Macro and Micro

Attitude and behavior changes that result from successful persuasion can take place on two levels: macro and micro. Understanding these two levels of persuasion will make it easier to identify, design, or analyze persuasion opportunities in most computing products.

A game called HIV Roulette, which I’ll describe in more detail in Chapter 4, is designed to persuade users to avoid risky sexual behavior. Baby Think It Over, also detailed in Chapter 4, is designed to persuade teenage girls to avoid becoming pregnant. Persuasion and motivation are the sole reasons such products exist. I use the term macrosuasion to describe this overall persuasive intent of a product.


Some computing products, such as email programs or image manipulation software, do not have an overall intent to persuade, but they could incorporate smaller persuasive elements to achieve a different overall goal. I refer to this approach as microsuasion.

Microsuasion elements can be designed into dialogue boxes, icons, or interaction patterns between the computer and the user.[9] For example, in educational software applications, microsuasion techniques—such as offering praise or giving gold stars for completing a task—can lead students to stay on task longer, understand the material better, or develop stronger brand loyalty.

Quicken, the personal finance application created by Intuit, provides a good example of how microsuasion can make a product more effective. The overall goal of the product is to simplify the process of managing personal finances. But note how the program uses microsuasion to achieve this goal. At the simplest level, the software reminds people to pay bills on time, helping them be financially responsible. The program also tracks personal spending habits and shows results in graphs, highlighting the financial consequences of past behavior and allowing projections into future financial scenarios. In addition, the software praises users for doing menial but necessary tasks, such as balancing their online check register. These microsuasion elements—reminders, visualizations, and praise—are influence strategies embedded in the Quicken experience to change what users think and how they act.
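
Two of the microsuasion elements just described, reminders and praise, can be sketched in a few lines. The function names, message wording, and the three-day reminder window are all hypothetical; this is not Intuit's actual logic, only an illustration of how small persuasive elements get embedded in a productivity tool.

```python
from datetime import date

def bill_reminder(bills, today, days_ahead=3):
    """Return reminder messages for bills due within `days_ahead` days.

    `bills` is a list of (name, due_date) tuples. Bills already past
    due or further out than the window produce no message.
    """
    msgs = []
    for name, due in bills:
        days_left = (due - today).days
        if 0 <= days_left <= days_ahead:
            msgs.append(f"Reminder: '{name}' is due in {days_left} day(s).")
    return msgs

def praise_for_task(task):
    """A one-line praise element for completing a menial task."""
    return f"Nice work! You finished: {task}."
```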

Consider a few ways that microsuasion is used in CodeWarriorU.com, a site designed to teach people how to use the CodeWarrior tools to develop software applications. To convince users that its teaching methods are effective, the site uses testimonials, easily accessible from the homepage. To persuade users to enroll, the homepage extols the benefits of at least a dozen courses, casting a wide net in making the sales pitch to prospects. In addition, no matter where users go on the site, on every page they see invitations to enroll, in the form of prominent buttons that say “Register” and “Enroll now.” Furthermore, the site reduces barriers to enroll: it’s free and easy to do.

The site also uses microsuasion techniques to motivate users to continue making progress in their chosen course. Each course has a schedule with a firm ending date, which serves to set both work expectations and a deadline. Each lesson has tracking features that help users see how much they’ve completed and how much work remains. The CodeWarriorU.com system also tracks students by maintaining a transcript that includes completion dates of assignments and performance on quizzes. To further motivate users to continue progressing, the site makes enrollment public to other students through a class roster and discussion area, as well as by sending preprogrammed emails that prompt users to complete their work. All of these microsuasion elements contribute to the overall learning goal of CodeWarriorU.com.[10]

Microsuasion on the Web

Examples of Web sites that use microsuasion are plentiful and sometimes subtle. For example, eBay has created a rating system—what it calls “feedback”—whereby buyers and sellers evaluate each other after a transaction is completed. This system motivates people to be honest, responsive, and courteous in their interactions. Similarly, the survival of epinions.com, a site that “helps people make informed buying decisions,”[11] hinges on persuading people to share their opinions online. To encourage this, epinions hands out highly visible titles of status (“Top Reviewer” and “Editor”) when people contribute many reviews that are valued by readers. Classmates.com uses the lure of curiosity—finding out more about high school classmates—to persuade browsers to register their personal information at the site. Once registered, users have access to the information about others in their class who have registered. In pursuing its overall macrosuasive goal of motivating people to quit smoking, Quitnet.com uses public commitment (announcing one’s quit date) as a microsuasion strategy. All of these techniques involve persuasion on a micro level.

Microsuasion in Video Games

Video games are exceptionally rich in microsuasion elements. The overall goal of most games is to provide entertainment, not to persuade. But during the entertainment experience, players are bombarded with microsuasion elements, sometimes continuously, designed to persuade them to keep playing.

WarCraft III is a real-time strategy (RTS) game that uses microsuasion elements to make the game compelling (if not addictive for some). Throughout the game, as players kill enemies, they hear a dying sound, an audio reinforcement for succeeding. If players kill monsters, who are neither friend nor foe in this game, the dying monsters drop gold or other items of value that players can use later as resources. The prospect of gaining new powers also serves as microsuasion. Specifically, if a “hero” belonging to one of the players progresses to the next level, the player can select a new power for that individual, such as the ability to heal others. And of course, players are motivated by the challenge of getting themselves ranked on the high score list.


As the previous discussion suggests, designers of products such as Baby Think It Over must understand macrosuasion techniques to succeed in their overall goal of persuasion. But even designers of products such as productivity software—products that do not have persuasion as their primary goal—must understand how persuasion techniques can be used at the micro level in order to make their products more effective and successful.

Captology: Summary of Key Terms and Concepts

  1. For purposes of captology, persuasion is defined as an attempt to change attitudes or behaviors or both (without using coercion or deception).

  2. Captology focuses on attitude or behavior change resulting from human-computer interaction (HCI), not from computer-mediated communication (CMC).

  3. Captology focuses on planned persuasive effects of technology, not on side effects of technology use.

  4. Captology focuses on the endogenous, or “built-in,” persuasive intent of interactive technology, not on exogenous persuasive intent (i.e., intent from the user or another outside source).

  5. Captology recognizes that technology can persuade on two levels, macro and micro.

Notes and References

For updates on the topics presented in this chapter, visit www.persuasivetech.info.

  1. Persuasion scholars don’t agree on a single definition of persuasion. For example, Reardon defines persuasion as “the activity of attempting to change the behavior of at least one person through symbolic interaction” (Reardon 1991, p. 3). Other scholars (including myself) view persuasion more broadly. For example, see D. Forsythe, Our Social World (New York: Brooks/Cole, 1995). Also, in their definition of persuasion, Zimbardo and Leippe (1991) extend persuasion to encompass changing a person’s “behaviors, feelings, or thoughts about an issue, object, or action” (p. 2). Other scholars expand persuasion beyond the idea of “changing”; persuasion includes shaping and reinforcing (Stiff 1994).


If you are interested in investigating the definition of persuasion further, these sources are a good starting point:

  • a. K. K. Reardon, Persuasion in Practice (Newbury Park, CA: Sage, 1991).

  • b. P. G. Zimbardo and M. Leippe, Psychology of Attitude Change and Social Influence (New York: McGraw-Hill, 1991).

  • c. J. B. Stiff, Persuasive Communication (New York: Guilford, 1994).

  2. The line between persuasion and coercion can be a fine one. Consider dialogue boxes that won’t go away until you’ve answered the questions they pose; sites that require you to provide personal information before you can view their “free” content; and ads that pop up right over the part of the page you are trying to read. These and other “persuasive” techniques may be viewed as subtly coercive and may have a cumulatively negative effect on users.

  3. C. Castelfranchi, Artificial liars: Why computers will (necessarily) deceive us and each other, Ethics and Information Technology, 2: 113–119 (2000).

  4. For example, both Reardon (1991) and Zimbardo and Leippe (1991) discuss distinctions among persuasion, coercion, and deception. See

    • a. K. K. Reardon, Persuasion in Practice (Newbury Park, CA: Sage, 1991).

    • b. P. G. Zimbardo and M. Leippe, Psychology of Attitude Change and Social Influence (New York: McGraw-Hill, 1991).

  5. T. J. M. Bench-Capon and A. M. McEnery, People interact through computers, not with them, Interacting with Computers, 1(1): 48–52 (1989).

  6. Computer-mediated communication (CMC) is a large area, so it’s difficult to single out one article or person to represent the work in this domain. For a broad picture of CMC, visit John December’s online resource about computer-mediated communication at http://www.december.com/cmc/info/ (note: this is a for-profit effort). His site gives pointers to more specific areas in CMC, such as conferences, journals, and organizations.

  7. Stanford professor Donald Roberts was the first to help me clearly see the distinction between effects and effectiveness, including the key role intention plays in interpreting outcomes. I use different terms in my writing (planned effects versus side effects), but the concept is the same. Don Roberts and Nathan Maccoby address the issue of intended and unintended outcomes in the following:

    • D. F. Roberts and N. Maccoby, Effects of mass communication, in G. Lindzey and E. Aronson (eds.), The Handbook of Social Psychology, 3rd ed., vol. II (New York: Random House, 1985), pp. 539–598.
  8. A 1999 study by Nielsen Media Research documents that kids are watching less TV and proposes that one factor is competition from video games. For a brief summary of this research, see http://www.ncpa.org/pd/social/pd120299h.html.


A longer article, drawing on various studies, that talks about the decline in kids’ TV watching and suggests that computer games are a factor, is Lauren Rublin’s “Tuning Out,” published in Barron’s on November 8, 1999.

  9. Many common interaction patterns found in human-human interactions can be applied to HCI. For example, the “door-in-the-face” technique involves asking a big favor to which a person is likely to say no, then exploiting the guilt the person feels in order to persuade him or her to do a smaller favor.

  10. One could argue that the real purpose of CodeWarriorU.com is not to help students to learn but to sell them books and software for each course. Even so, my main point still applies: the microsuasion elements I outline contribute to a larger overall goal.

  11. This quote about the purpose of epinions comes from http://www.epinions.com/about/.

Pandaemonium Architecture 6.0 — ATEK-639/439 — Fall 2025