How would you feel if an airline assigned you a seat based on your body size, using data your credit card issuer collected from your shopping habits?
That hypothetical scenario could one day become reality if the U.S. Patent and Trademark Office approves a Mastercard patent application that was filed in 2015 and became public last December.
In its application, Mastercard seeks to patent “a system, method, and computer-readable storage medium” that would analyze cardholders’ physical dimensions, based on factors such as the dress, pants and shoe sizes of charged purchases, and allow the company to share those body stats with airlines, railways and bus companies.
As to how this system might affect airline travel, the Mastercard application suggests that “for the comfort of their passengers, transportation providers should avoid seating physically large strangers next to each other.”
The application explains that Mastercard can often obtain your clothing and shoe size from the Stock Keeping Unit (or SKU) number on the items you charge. SKU data also helps the card company estimate the size and weight of your whole family, as well as exclude wearables purchased as gifts, due to their “rarely purchased sizes.”
As a nod to consumer rights, the application notes that Mastercard “may first need to” obtain cardholders’ consent, via an opt-in, to use such “Personally Identifiable Information.”
The travel-news site Skift first spotted the application. We asked Mastercard for comment, and company spokesman Seth Eisen said the company has no immediate plans for it.
“We are constantly innovating and file many patents,” Eisen said. “For this particular patent, we have not focused on its application and have no further information to share.”
News of the patent application did prompt an immediate response from Susan Grant, the director of consumer protection and privacy for the Consumer Federation of America, an association of more than 250 nonprofit consumer groups.
“I’m horrified,” she says. “Sometimes when you have the ability to collect information, the impetus is often to figure out how to monetize it, and this is just another example.
“Everybody is in the data-grabbing business these days; whether they need that data for their own purposes or not, they may be able to sell it to somebody. Your personal information is the commodity now.”
Grant’s concerns are threefold:
• Consumer consent: “I’m not sure if Mastercard customers would know about it, and if they did, what sort of control they would have over it,” she says. “At the very least, I would think that they would have to opt in for their information to be used for purposes like this.”
• Data accuracy: “When my father, who recently passed away, was wheelchair-bound the last few months of his life, I was using my credit card to buy clothes for him and a lot of other things, on the basis of which I now know that assumptions are being made about me because I’m getting advertisements and coupons for things that I don’t need,” she recalls. Where would Mastercard’s data sharing stop?
• Inappropriate conduct: “For a company that you have a relationship with for purposes not related to this to make those assumptions based on the information they can glean and provide it to third parties is an example of how (data mining) can go beyond the pale,” she says.
The Mastercard patent application also raises another beef that consumer advocates have with the airline industry: As Americans have gotten larger, airline seats have gotten smaller.
“It raises a lot of questions about whether airlines should accommodate their customers by actually making their seats more comfortable rather than less, and whether you should have to pay a premium for that,” Grant says. “You’re getting less and paying more.”
Grant wouldn’t be surprised to see Mastercard share its cardholder biodata with the transportation industry.
“There’s not really any law that prevents it, and we really haven’t set public policy parameters yet about even such basic things as what control people should have, let alone other, more fundamental issues about what’s fair and what’s not when it comes to collecting and using data about people,” she explains. “This is a great example of a potential privacy nightmare.”