STUDENTS AT MIT, STANFORD, AND ELSEWHERE ARE LEARNING TO BUILD TECHNOLOGY PRODUCTS WITH USER WELL-BEING IN MIND.
Facebook sparked outrage this summer when it published results of a study conducted on unwitting users. The study examined whether people shown more positive or negative words in friends' posts would themselves write more positive or negative words; Facebook apparently did not consider the ethics of manipulating users' emotions. (Indeed, users shown more negative words wrote more negative posts, and vice versa.) "Was this designed to create maximum benefits for the end user? I can't really see that anywhere in that research," says Marc Smith, a sociologist who spent 10 years as a researcher at Microsoft. "Where is it saying, 'This is how we will now deal with people who are borderline depressed, we're going to start steering them toward happier stuff?'"