If you work for Uber or DoorDash, your boss isn't a person but an algorithm
App-driven jobs in the gig economy can mean constant surveillance
The work day ends and you Uber home, leaving the driver a five-star review for a blissfully uneventful ride.
Your stomach rumbles, and since you're already holding your phone, you swipe over to another app and order dinner, adding a $3 tip for the delivery person.
But who's really pocketing that extra $3?
The exchange is mediated by an app, and because of that dynamic, the only way to really know whose wallet the tip is ending up in would be to ask the delivery person (awkward) or look at their screen (creepy and awkward). Plus, the extra effort of doing so would negate the frictionless ease of such a transaction.
Digital apps like Uber, Lyft, DoorDash and Postmates are all part of the gig economy. Arguably, they're advantageous for both sides: You get your dinner or a ride home from work in a few simple swipes, and someone else gets to make some quick cash, with the freedom of being their own boss.
Or are they?
Many gig workers don't realize, in fact, that they do have a boss. It's just not a human being.
Employees or not?
On its website, Uber refers to drivers and delivery people as "partners." It says, "When you drive with Uber, you're an independent contractor." DoorDash calls its delivery people "Dashers," and like Uber, clearly states that they are not employees of the company.
Critics say that just because these billion-dollar businesses classify the people who power them as "independent contractors" rather than employees doesn't mean that's really the case.
According to Alex Rosenblat, the author of Uberland: How Algorithms Are Rewriting the Rules of Work, the significance of this distinction is that drivers are largely excluded from the protections of employment law, such as a minimum wage, anti-discrimination law or the ability to collectively bargain over pay.
Indeed, some companies have even claimed that the workers driving you around or delivering your food are in fact "consumers" themselves, arguing that they're just users of the app, employing a different feature set.
The rationale is that if drivers are considered consumers or customers, the regulatory framework in which they are categorized can be shifted from labour and employment law to consumer protection law.
"When Uber does it, it's to argue that employment law doesn't apply to drivers, despite numerous lawsuits and legal challenges contending that they are misclassified as independent contractors, rather than as employees," Rosenblat noted.
The reality is, all of the workers in the gig economy do, in fact, report to a manager, albeit an algorithmic one that "enacts a series of rules that Uber has set for how drivers should behave on the job."
As researchers Alexandra Mateescu and Aiha Nguyen explain in a paper called Algorithmic Management in the Workplace, "ride-hail platforms exert 'continuous, soft surveillance' through data collection of drivers' behaviours, which may be fed into automated performance reports."
They note that while drivers have the freedom to log in or out of work at will, once they're online, the algorithmic manager is monitoring everything they do — tracking their movements, acceleration, braking habits, working hours and more, all through their phones.
These invisible managers also "nudge workers to behave in ways that benefit the company," said Karina Vold, a Digital Charter Fellow at the Alan Turing Institute, coaxing workers to put up with intolerable passengers or keep driving when they want to log off for the day.
When your boss is an algorithm
Rosenblat has spent countless hours talking to Uber drivers for her research, and reported that many of them say having an algorithm as a manager is better than having a real boss.
That is, until things go awry.
"The original feeling of independence from a boss feels empowering, but it starts to be problematic when there is unfairness or something goes wrong and there is no recourse," Rosenblat said.
For example, say a driver has just given a ride to a passenger who was hostile, aggressive or overtly racist, and is worried that person will give her a low star rating, pulling down her average and in turn jeopardizing her livelihood. There is no one to turn to.
Indeed, said Rosenblat, "for a long time, when drivers needed help or had concerns, there wasn't even a phone number they could call."
Just as there's no co-worker or manager to turn to with concerns, there's no recourse when the algorithm "deactivates" your account — or, to put it into more human terms, fires you.
By claiming to be technology companies, as opposed to, say, taxi companies or delivery services, these apps are able to side-step regulatory oversight and employment laws. As such, drivers can be let go without recourse.
The humans in charge
Just because there are no humans to report to doesn't mean there aren't humans higher up the chain of command. Algorithms might be monitoring workers, but executives and engineers are designing those algorithms, and cashing in.
As these companies come under increasing scrutiny for their questionable practices, they often point to their algorithms as a way of deflecting attention from who is really in charge.
DoorDash, a food delivery app, recently came under fire for stating that 100 per cent of the tip customers pay goes to the driver, while omitting the fact that the tip was used to meet the guaranteed pay minimum for that delivery.
The controversy arose because when a customer added a tip for the driver, DoorDash would count it toward the base rate of pay for that delivery, instead of adding it on top of that fee, as is customary for tipping. As a result, the tip wasn't really a tip for the driver at all — just a way for DoorDash to pay less out of its own pockets.
DoorDash CEO Tony Xu apologized on Twitter, stating that moving forward, the company would "ensure that Dashers' earnings will increase by the exact amount a customer tips on every order," and that it had "built a pay model to prioritize transparency, consistency of earnings and to ensure all customers get their food as fast as possible."
Rosenblat said it's the lack of transparency that causes confusion in scenarios like this one in the first place.
Indeed, because people tend to think of algorithms as fairer or more objective than humans, she said they can be used to obscure power and control. One example is Uber's surge pricing, in which users are charged a premium depending on the location, time of day or other factors, supposedly assessed by the algorithm.
"They use the algorithm as a means of saying it is objective, but really, it's just price discrepancy," said Rosenblat.
"The algorithm is used as an excuse, and this is the case across the entire gig economy."