Showing posts with label Marketers. Show all posts

Thursday, December 19, 2013

‘Data brokers’ selling personal info of rape victims to marketers - report

AFP Photo / Jean-Sebastien Evrard

“Data brokers” track, categorize and sell personal health information for marketing use, a new US Senate report reveals. Data groupings include rape victims, HIV-positive individuals, people with depression and dementia, and women’s gynecologist visits.


Hundreds of so-called “data brokers” in the US maintain databases made up of Americans’ sensitive health details. A report by the Senate Commerce Committee says the companies are legally allowed to withhold from individuals what data is collected, how one is categorized and who buys the information.


The report on the global multi-billion dollar industry was released Wednesday ahead of a committee hearing on such practices. Though the report does not detail wrongdoing, it does point out the reams of consumer data made available to marketers in the digital era. The information is used for targeted advertising across the web.


“Millions of consumers are now using computers, smart phones, and tablets to make purchases, plan trips, and research personal financial and health questions, among other activities,” the Senate report explains. “These digitally recorded decisions provide insights into the consumer’s habits, preferences, and financial and health status.”


During the Senate Commerce Committee hearing on Wednesday, privacy groups warned how far some companies have gone to amass and sell a person’s confidential information.


“There are consumer list brokers that sell lists of individually identifiable consumers grouped by characteristics. To our knowledge, it is not practically possible for an individual to find out if he or she is on these lists,” said Pam Dixon, executive director of the World Privacy Forum, in her testimony. “If a consumer learns that he or she is on a list, there is usually no way to get off the list.”


Dixon named one broker, MEDbase200, that has sold lists of rape and domestic violence victims.


The committee found another, Epsilon, that offered at least one list of people who allegedly have medical conditions including anxiety, depression, diabetes, high blood pressure, insomnia, and osteoporosis.


Epsilon spokesperson Diane Bruno told the Wall Street Journal that many consumers report the information themselves on the company’s opinion-research website, and that Epsilon cooperated with the Senate committee. Yet she defended shielding the lists from individuals.


“We also have to protect our business, and cannot release proprietary competitive information, or information that we’re prohibited from releasing based on contractual agreements with our clients.”


The report showed Equifax, one of the biggest consumer credit reporting agencies in the US, keeps a database that includes women’s visits to gynecologists within the last year.


The largest broker, Acxiom, allows consumers to view and amend the data gathered on them, yet it does not let them see how that information is being used.


The Senate report says the assembled information does not stop with personal health. People’s incomes, home loans and pets, for example, are used to put individuals into groups like “rural and barely making it” and “ethnic city strugglers.”


A great deal of the consumer data collected by the companies is inaccurate, according to reporting by the Wall Street Journal.


In some cases, compiled databases reveal contact information that is not allowed to be made public, the report found.

“This is where lawmakers can work to remove unsafe, unfair, and overall just deplorable lists from circulation,” Dixon said during the hearing. “There is no good policy reason why unsafe or unfair lists should exist.”


A recent study by the Government Accountability Office found that federal law does not protect consumers’ right to know what is being collected or how the data is used.


The Fair Credit Reporting Act allows consumers to correct any credit information an agency may provide to landlords, employers, banks and others. The Health Insurance Portability and Accountability Act bars health providers and insurance companies from sharing patient information with outside entities. Yet that federal law does not cover data-mining brokers that sell health profiles. The Federal Trade Commission has called on brokers to be more transparent with the data.


“Current federal law does not fully address the use of new technologies, despite the fact that social media, web tracking, and mobile devices allow for faster, cheaper and more detailed data collection and sharing among resellers and private-sector entities,” the Senate report says, calling for more oversight of the industry.


The hearing’s revelations pertaining to MEDbase200, the company that collected information on victims of sexual assault and domestic violence, led the company to remove those lists from its website. A spokesperson for the company’s parent organization told the Wall Street Journal that MEDbase200 did not intend to peddle any list entitled “rape sufferers” – though one was offered at a price of $79 per 1,000 names – and that it was only a “hypothetical list of health conditions/ailments” created for internal use.


Upon questioning from the Wall Street Journal, MEDbase200 also nixed lists of HIV/AIDS patients and “peer pressure sufferers” that were for sale.





BlackListedNews.com



Source: RT

Wednesday, July 24, 2013

Online Marketers Take Note Of Brains Wired For Rewards





Popular online games like FarmVille use powerful reward systems to get players to spend real-world money on virtual items.



Zynga/AP




Ask yourself: Are you addicted to technology — any technology? Do you check email obsessively, tweet without restraint or post on Facebook during Thanksgiving dinner? Or perhaps you are powerless in the face of an iPad loaded with Angry Birds?


Many of the most popular technologies of our time tap into powerful reward mechanisms in our brains. And while most researchers stop short of calling video games and modern tech addictive, there’s evidence that these technologies alter how our brains work and change how we behave.


Research has even demonstrated that gamers will get a boost of dopamine when they play.


Many techies and marketers are tapping, sometimes unintentionally, into decades of neuroscience research to make their products as addictive and profitable as possible.


A couple of weeks ago I got a pitch from Uber, the creators of the car service app of the same name. Every once in a while when you open the Uber app, you are greeted with a surprise, and the company will offer an unexpected service.


“We’ve done pedicabs in Austin,” says Travis Kalanick, Uber’s co-founder and CEO, “[and] we’ve done on-demand Texas barbecue. We’ve done Uber chopper and we’ve done on-demand roses on Valentine’s Day.”


Last Friday, the surprise was on-demand ice cream.


“It’s not our core business; it’s not what we do normally,” Kalanick says. “It’s just fun.”


The thing about these PR stunts is that customers love them. Traffic to Uber skyrocketed Friday. The other thing is that you never know when to expect these little rewards, so it pays to check Uber’s app and click, and then click again.


And something about that reminded me of a very old, very famous psychology experiment known as the Skinner Box.


“An unexpected reward has much more power than one that is regular in driving behavior,” says Nora Volkow, the head of the National Institute on Drug Abuse. “This has been known for a very long time.”


More than 60 years ago, the famous American psychologist B.F. Skinner demonstrated that unpredictable rewards created obsessive behavior in lab rats. The rats would press the bar again and again, hoping to trigger a random reward.


“We are not mad scientists trying to figure out unexpected reward systems that Skinner predicated in theories decades ago; that’s not us,” Uber’s Kalanick says.


Still, random reward structures are built, sometimes unintentionally, into many of the technologies we use every day.
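Skinner’s schedule is simple enough to sketch in a few lines of code. The toy simulation below is not from the article; the function names and the 10 percent payout rate are illustrative assumptions. It contrasts a variable-ratio schedule, where any given check of an app might pay off, with a fixed-ratio schedule that rewards exactly every tenth check; both pay out at the same average rate, and the only difference is predictability.

```python
import random

def variable_ratio_rewards(checks: int, p: float = 0.10, seed: int = 0) -> list[int]:
    """Skinner-style variable-ratio schedule: each check of the app (or press
    of the bar) pays off with probability p, so the user never knows which
    check will be the rewarded one."""
    rng = random.Random(seed)
    return [i for i in range(checks) if rng.random() < p]

def fixed_ratio_rewards(checks: int, every: int = 10) -> list[int]:
    """Predictable schedule for comparison: a reward arrives on every Nth
    check, so there is no surprise at all."""
    return [i for i in range(checks) if (i + 1) % every == 0]

if __name__ == "__main__":
    n = 100  # times the user opens the app, or the rat presses the bar
    print("variable-ratio rewards at checks:", variable_ratio_rewards(n))
    print("fixed-ratio rewards at checks:   ", fixed_ratio_rewards(n))
```

Skinner’s finding, echoed by Volkow above, is that the unpredictable schedule sustains far more pressing and checking, even though the expected payoff is identical.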


Even responses to tweets or Facebook posts offer unpredictable rewards. Just talking about ourselves triggers reward mechanisms in our brains. When people pay attention to what we say, it feels even better.


But think about it: Do you know ahead of time which tweets will be retweeted or which posts on Facebook will attract likes? You don’t. So it’s a bit of a crapshoot, but when a post takes off, it feels great.


Rewards in video games are designed to be intentionally surprising. Even the ping of an incoming email contains the hope of unanticipated pleasure.


Some think all of this could be driving compulsive behaviors in people that resemble those of Skinner’s rats in a box. Over the past decades, researchers have realized much of this reward-seeking behavior is driven by dopamine.


“Writing a blog that then becomes viral will then hook you to want to repeat that act — that specific experimental story has not been done,” Volkow says. “But equivalents have actually [been] shown. The first one was many years ago in which they had people playing a video game, and when individuals got a point, dopamine got activated — an unexpected reward.”


Volkow and others have studied how the human brain releases dopamine in anticipation of a variety of rewards, from sex to food to cocaine.


We even get a bit of dopamine when we talk about ourselves, which might help explain Facebook’s global popularity.


Dopamine is the brain’s way of rewarding behaviors that helped humans survive. It’s released when we eat or have sex or learn, but Volkow and others have shown that when it’s manipulated with drugs, the dopamine response in our brains plays an important role in addiction.


While it is far too soon to say that video games or other types of technology are truly addictive, there is evidence that avid gamers, for example, process these kinds of neurochemical rewards differently.


Volkow says when she sees stories about people spending real money for imaginary or virtual products in games like FarmVille, she’s reminded of research that used dopamine to manipulate rats through a complex maze.


“They actually wanted rats to be able to act like little spies, like little robot spies,” Volkow says. “You could put a [recorder] in the rat and the rat just has to go where you want it to go and record the conversations that are happening.”


Volkow says they designed the rats basically by manipulating, with electrodes, these dopamine reward systems.


When the animals headed in the right direction, they received the sensation of pleasure. Rats with these electrodes wired into their brains and connected via a wireless backpack climbed ladders, navigated complex mazes and would do almost anything the researchers wanted them to do.


“There was nothing in it for the rat except the sensation of reward,” Volkow says.


Ramin Shokrizade says a well-designed video game works in a very similar way. “I think that analogy translates completely to humans,” Shokrizade says.


Shokrizade studied neuroscience before switching careers, and now he helps video game companies monetize their games.


“I would say my primary job when I am creating a monetization model for a game is to do exactly the same thing to humans,” he says.


Shokrizade believes that the rush of pleasure games provide can be addictive. And he says some game designers have made a fortune by creating games that slowly encourage players to pay for that rush of pleasure.



