10 Ways Technology Hijacks Your Behavior
Data, gamification and other means of psychological manipulation can influence you to make healthy choices, work harder, spend money and more.
From push notifications and reminders to ratings and rewards programs, technology has the power to nudge you to think and act in specific ways at specific times.
Addictive design keeps you hooked, algorithms filter the ideas and options you're exposed to, and the data trail you leave behind comes back to haunt (or target) you later. The virtual world is full of features and ads that may trick you into adopting beliefs, buying products or just staring at the screen longer.
These tricks can have good intentions: A fitness app might encourage you to run an extra mile, while a calendar alert might remind you of a big meeting. Other times, tech might distract you from important tasks, quality time with loved ones or activities that would serve your best interests.
Read on for 10 ways technology is hijacking your behavior, for better or for worse.
1. It beckons.
Anyone who has a smartphone knows that it can be difficult to ignore that buzzing, beeping, incessantly illuminated screen, even in situations when it detracts from your presence, such as in meetings or at the dinner table.
App makers push notifications to get users to engage. That's why, for instance, Instagram tells you when someone you follow has posted for the first time in a while, luring you to open the app and take a look.
One of today's most prominent activists working to raise awareness about tech's influence over our attention, behavior and overall well-being is Tristan Harris. He formerly served as product philosopher at Google, and he's the co-founder of the Center for Humane Technology (and the Time Well Spent movement).
In one essay, Harris explains that smartphones and the apps that run on them resemble slot machines in their design. As a result, the average person checks their phone 150 times a day, often unconsciously, because each check sets them up to receive a "variable reward." They might get nothing -- no new notifications or messages -- or they might get a link to a funny meme from a friend, a photo of a baby or news of progress on a project they're working on.
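To make that mechanism concrete, here is a minimal sketch of a variable-reward schedule in Python. The payoff pool and function names are invented for illustration and aren't drawn from any real app; the point is simply that an unpredictable reward keeps you pulling the lever.

```python
import random

# A minimal sketch of a variable-reward schedule, the same intermittent
# reinforcement pattern slot machines use. The payoff pool is hypothetical.
POSSIBLE_REWARDS = [
    None, None, None,                      # most checks yield nothing
    "funny meme from a friend",
    "photo of a baby",
    "progress update on your project",
]

def check_phone() -> str:
    """Each check draws from an unpredictable payoff pool."""
    reward = random.choice(POSSIBLE_REWARDS)
    return reward or "no new notifications"

if __name__ == "__main__":
    # Unpredictability is the hook: you never know which check will pay off.
    for _ in range(5):
        print(check_phone())
```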
There's also an obligation factor that drives the impulse to check personal devices: We might miss something important if we don't, or we might offend someone by not responding quickly enough or reciprocating a gesture.
2. It takes up mental space.
Even when we're not looking at our phones and have made a conscious effort to ignore them, such as turning off notifications and ringers or powering them off entirely, they can still distract us.
Researchers from the University of Texas at Austin, the University of California, San Diego and Disney Research recently conducted a study and found that when a person's smartphone is nearby -- on the table or even in the same room -- that person's performance on a cognitive task (requiring problem-solving and reasoning) will likely suffer.
Related: Why Just Having Your Phone Near You Messes With Your Brain
The diminished ability is akin to what sleep deprivation might cause, the researchers found, noting that people performed best on tasks when their phone was in another room and worst when their phone was on the table, whether the phone was on or off.
In a summary of their findings in Harvard Business Review, the researchers explain that "humans learn to automatically pay attention to things that are habitually relevant to them, even when they are focused on a different task." Ignoring something that's calling for your attention takes effort and eats into the very focus you're trying to protect. Just think of a time when you were working on something and someone called your name from across the room. Chances are, you lost focus.
3. It alters your perception of your options.
The internet opens up a whole new world. You might Google "cafes" and discover a new lunch spot that you otherwise might not have known about. You might need a new pair of shoes, and rather than being constricted to the options local brick-and-mortar retailers have to offer, you can pick from countless pairs and have one shipped to your door.
Even though we theoretically have access to what can seem like every product, place of business and source of information via the web, we often browse these options through platforms that filter them for us, to narrow down the seemingly infinite array. What we don't always think about, Harris explains, is that we might miss a great option if we only choose from what an algorithm serves up.
Harris provides the hypothetical scenario of a group of friends searching Yelp for a nearby bar to go to after dinner. "The group falls for the illusion that Yelp's menu represents a complete set of choices for where to go," he writes. "While looking down at their phones, they don't see the park across the street with a band playing live music. They miss the pop-up gallery on the other side of the street serving crepes and coffee."
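A toy sketch shows how that narrowing happens in code: anything not in the platform's index simply never appears in the ranked list. The venue data and scoring below are invented for illustration, not taken from Yelp.

```python
# A toy sketch of how a platform's filter narrows what you "see."
# Venue names, ratings and the ranking rule are hypothetical.
nearby_options = [
    {"name": "The Tap Room",         "listed": True,  "rating": 4.2},
    {"name": "Corner Bar",           "listed": True,  "rating": 3.9},
    {"name": "Park with live band",  "listed": False, "rating": None},  # not in the index
    {"name": "Pop-up crepe gallery", "listed": False, "rating": None},  # not in the index
]

def platform_results(options, limit=3):
    """Only indexed, rated venues get ranked; everything else disappears."""
    indexed = [o for o in options if o["listed"]]
    return sorted(indexed, key=lambda o: o["rating"], reverse=True)[:limit]

print([o["name"] for o in platform_results(nearby_options)])
# ['The Tap Room', 'Corner Bar']; the park and the pop-up never show up
```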
4. It reinforces your beliefs.
The 2016 U.S. presidential election heightened public awareness of a concept known as the "filter bubble," coined by Upworthy co-founder Eli Pariser and explored in his 2011 book The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think.
Simply put, the filter bubble is the personalized slice of the web each user sees: Algorithms learn what you click, search for and engage with, then serve you more of the same while screening out the rest. Of course, this dynamic exists offline, too -- we make friends who have similar interests and ideologies, for example. This might limit our thinking, but can it influence our behavior?
A 2016 study published in the Croatian Medical Journal explored the effects of the filter bubble on the personal health information people saw online. For example, someone whose network of friends doesn't believe in the effectiveness and safety of vaccinations might be more likely to search "vaccines and children" and receive results suggesting that vaccinations are dangerous. This also risks reinforcing any preconceived notions, due to confirmation bias.
"The search history, social network, personal preferences, geography and a number of other factors influence the information found by the searcher," study author Harald Holone wrote in a summary of his findings.
Given how much information is online, we may have the illusion that we're exposed to a range of ideas when really we're building a virtual echo chamber for ourselves. And due to the addictive nature of technology, it's difficult to escape the filter bubble without a conscious effort. Some people have actively tried to counter it by collecting Facebook friends with opposing viewpoints and liking pages that don't interest them.
5. It collects information about you that can be used to influence you later.
Related to the filter bubble concept, all web and social platform users are familiar with how targeted advertising works. You Google something, look for a product on Amazon, put an item in your virtual shopping cart, browse flight booking options -- then, maybe hours or even weeks later, you see an ad for whatever you were eyeing earlier.
It's pretty clear what's happening here: Sellers are trying to influence your decision to buy. There are ways to get around this type of targeting, from adjusting your Facebook settings to clicking on individual ads and specifying that you don't wish to see ads of that nature. Or, you can install an ad blocker to spare yourself altogether.
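The underlying retargeting rule is simple: if you looked but didn't buy, you see the ad again. Below is a bare-bones sketch under that assumption; the event names, 30-day window and ad copy are hypothetical, not any particular ad network's logic.

```python
from datetime import datetime, timedelta

# A simplified sketch of retargeting, assuming an ad platform keeps a
# per-user browsing trail. Event names and the window are invented.
browsing_trail = [
    {"event": "viewed_product", "item": "running shoes", "when": datetime.now() - timedelta(days=2)},
    {"event": "added_to_cart",  "item": "running shoes", "when": datetime.now() - timedelta(days=2)},
]
purchases = set()  # nothing bought yet

def pick_retargeting_ad(trail, purchases, window_days=30):
    """Show an ad for anything recently eyed but not purchased."""
    cutoff = datetime.now() - timedelta(days=window_days)
    for event in reversed(trail):
        if event["when"] > cutoff and event["item"] not in purchases:
            return f"Ad: still thinking about those {event['item']}?"
    return None

print(pick_retargeting_ad(browsing_trail, purchases))
```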
Related: The Shocking Lessons I Learned After I Quit My Social Media Addiction in 3 Days in the Desert
Increasingly, people are worried that their information might be used for more than just selling them stuff. In March 2018, news surfaced of a data breach that resulted in data of about 50 million Facebook users getting into the hands of voter-targeting consultancy Cambridge Analytica. Because Cambridge Analytica had ties to the Trump campaign prior to the 2016 election, some suspect that data may have been harnessed to sway voters via targeted political messaging.
Regardless of how the data was used, the revelation has prompted Facebook to crack down on third-party developers' access to user data, clarify privacy settings and cut off the use of data from third-party aggregators to supplement its own ad targeting.
6. It keeps serving up the next thing.
Social media feeds allow users to scroll endlessly, but that's only one example of the never-ending waterfall of information that users encounter online. After watching a video on Netflix, Facebook or another site that hosts video content, you'll often see a countdown with a preview of another video that will autoplay after a few seconds. This tactic serves to keep you engaged and watching something new, even when you don't intend to.
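The pattern is opt-out rather than opt-in: unless you intervene during the countdown, the next item plays by default. Here's a stripped-down sketch of that "up next" logic; the queue contents and timing are invented for illustration.

```python
import time

# A bare-bones sketch of the autoplay pattern: after one video ends,
# a related video starts unless the viewer actively cancels.
up_next_queue = ["Related video A", "Related video B", "Related video C"]

def autoplay_next(queue, countdown_seconds=3):
    """Count down, then start the next item by default (opt-out, not opt-in)."""
    if not queue:
        return None
    for remaining in range(countdown_seconds, 0, -1):
        print(f"Up next in {remaining}...")  # a real player would show a cancel button here
        time.sleep(1)
    return queue.pop(0)

print("Now playing:", autoplay_next(up_next_queue))
```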
Even Entrepreneur.com autoplays videos after one finishes. Content creators and distributors naturally want you to keep reading and watching. That's also why we have links to other stories on article pages.
Usually, these options are related to the first piece of content a user consumed; they are algorithmically generated or hand-picked because they're likely to appeal to someone who chose that initial content.
Some content providers have learned over time that there is a limit to the effectiveness of autoplay when it happens right out of the gate. Overwhelmingly, users respond negatively when videos autoplay (especially with sound on) before they've opted in to watch -- on, say, a page they've visited with the goal of reading an article. Many are moving away from this model, which is driven by video advertising but bombards users and often drives them away.
7. It shortens your attention span.
"Ten years ago, before the iPad and iPhone were mainstream, the average person had an attention span of about 12 seconds," said Adam Alter, the author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, in an interview with NPR's Fresh Air. "Research suggests that there's been a drop from 12 to eight seconds ... shorter than the attention of the average goldfish, which is nine seconds."
Alter is not the only one to observe this phenomenon. Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, preceded his book with a 2008 Atlantic article in which he explained, "My mind now expects to take in information the way the net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski."
Because there is so much to see online, with hyperlinks, notifications and the mere existence of other sites and apps, distractions are hard to resist. Tweets, once capped at 140 characters (and today at a still brief 280), are easily skimmable, and posts on other social media platforms are often equally digestible at first glance.
We're neurologically programmed to seek the pleasure or reward that new bite-sized pieces of content provide, which causes us to enter a "dopamine loop," behavioral psychologist Susan Weinschenk explains in Psychology Today. "When you bring up the feed on one of your favorite apps the dopamine loop has become engaged," Weinschenk writes. "With every photo you scroll through, headline you read or link you go to, you are feeding the loop which just makes you want more. It takes a lot to reach satiation, and in fact you might never be satisfied. Chances are what makes you stop is that someone interrupts you."
8. It can trick you into thinking it's something more.
In the movie Her, Joaquin Phoenix's character, Theodore, falls in love with his virtual assistant, Samantha. But this phenomenon isn't confined to science fiction. Humans have the potential to form relationships with artificially intelligent personas. Even if we know we're not talking to a real person when we're typing back and forth with a chatbot, for example, if the AI seems sophisticated or real enough, our minds might get tricked into interacting with it as if it were.
Liesl Yearsley sold her machine-learning-powered virtual assistant startup Cognea to IBM Watson in 2014. As she writes for MIT Technology Review, because of the always-on, helpful nature of AI, people tend to perceive assistants as loyal and trusted companions, engaging in lengthy conversations and sharing personal details. She and her team built assistants with different objectives coded into their interactions with humans, such as boosting sales.
Related: Future Technology Will Bring Terrifying Prospects You Can Innovate Against
"The giant companies at the forefront of AI -- across social media, search, and ecommerce -- drive the value of their shares by increasing traffic, consumption and addiction to their technology," Yearsley writes. "They do not have bad intentions, but the nature of capital markets may push us toward AI hell-bent on influencing our behavior toward these goals."
Speaking of the blurry line between humans and AI, ethicists have explored questions such as whether it's OK to inflict violence on robots.
9. It turns everyday actions into games.
Gamifying certain behaviors is a powerful way to incentivize people to engage in them. Think of how fitness apps encourage you to set goals, let you compare your performance to other users' and congratulate you when you hit milestones. Or, how brands you shop with remind you of the number of loyalty points you've accumulated and entice you with the next reward you're eligible to unlock.
Then, there's gamification as it pertains to work. For example, ride-hailing apps such as Uber manipulate drivers, who are independent contractors and don't have scheduled shifts, to stay on the road longer. When a driver is about to log off, Uber will sometimes push them a notification that they may be eligible for a pay bonus if they continue shepherding customers for a bit longer. In the past, it's offered tiered bonuses to drivers who complete a certain number of rides on busy nights such as Halloween or New Year's Eve.
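A threshold-based nudge like the one described above can be sketched in a few lines. The ride counts, bonus amounts and notification wording here are invented for illustration, not Uber's actual rules.

```python
# A hypothetical sketch of a gamified bonus nudge: if a driver is close to a
# ride-count bonus, push a notification encouraging them to stay online.
BONUS_TIERS = {20: 25.00, 30: 50.00}  # rides completed -> bonus in dollars (invented)

def nudge_if_near_bonus(rides_completed, about_to_log_off=True):
    """Return a push notification if a bonus is within easy reach."""
    for threshold, bonus in sorted(BONUS_TIERS.items()):
        remaining = threshold - rides_completed
        if about_to_log_off and 0 < remaining <= 3:
            return f"You're only {remaining} rides away from a ${bonus:.2f} bonus!"
    return None

print(nudge_if_near_bonus(rides_completed=18))
# "You're only 2 rides away from a $25.00 bonus!"
```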
Uber also issues praise such as "Above and Beyond" to drivers who perform well based on rider feedback, according to The New York Times. The Times also reported that Uber has "experimented with video game techniques, graphics and non-cash rewards of little value that can prod drivers into working longer and harder -- and sometimes at hours and locations that are less lucrative for them."
Uber is just one gig platform that uses gamification tactics to incentivize its workers. Instacart, Postmates and DoorDash -- which also rely on gig workers -- offer performance metrics and other information designed to nudge workers toward more output.
10. It changes how we communicate.
Technology is a double-edged sword in many ways: It distracts us, but it also gives us access to information and lets us communicate globally and efficiently. Its effect on how we communicate cuts both ways, too. The ability to type quickly and distribute our ideas via the web makes mass communication possible for any individual with an internet connection. Social media helps us maintain communication with friends and family who live far away, or helps us build relationships with people who share our interests or with potential collaborators we wouldn't otherwise know.
But some research has shown that the more a person uses technology to communicate, the greater the anxiety they experience when it's time for a face-to-face interaction. Some parents raising children in the smartphone and tablet era limit their kids' "screen time" because they believe emerging research showing that speech and language development hinges on everyday human interactions.
Then there's the concept of "phubbing" -- snubbing an in-person companion by looking at your phone. A 2016 study published in Computers in Human Behavior found that 17 percent of smartphone users "phub" four times a day. More recently, a 2018 Journal of Applied Social Psychology study revealed that when people get phubbed, they perceive the quality of their communication and relationship with the phubber to be negatively affected, partly because getting phubbed reduces their sense of belonging.
Related video: How This Entrepreneur Uses Technology to Stay on Top of Things