
UK Companies Are Now Using AI To Monitor Their Workers Down To The Minute

Companies are known for tracking the “productivity” of their employees, sometimes by extreme means. However, the rise of supercomputing and artificial intelligence has allowed this to be amplified many times over, as some businesses are now using artificial intelligence to monitor the “productivity” of employees down to the minute:

Dozens of UK business owners are using artificial intelligence to scrutinise staff behaviour minute-to-minute by harvesting data on who emails whom and when, who accesses and edits files and who meets whom and when.

The actions of 130,000 people in the UK and abroad are being monitored in real time by the Isaak system, which ranks staff members’ attributes.

Designed by a London company, Status Today, it is the latest example of a trend for using algorithms to manage people, which trade unions fear creates distrust but others predict could reduce the effects of bias.

The system shows bosses how collaborative workers are and whether they are “influencers” or “change-makers”. The computer can compare activity data with qualitative assessments of workers from personnel files or sales performance figures to give managers a detailed picture of how behaviour affects output.

Users so far include five law firms, a training company called Smarter Not Harder and a London estate agency, JBrown, according to Status Today, which promises “real-time insights into each employee and their position within the organisational network”. Workers do not automatically have a right to see the data, which is controlled by the employer.

The insurer Hiscox and the IT firm Cisco have used the system for short-term analysis rather than continuous surveillance, Status Today said.

Critics say such systems risk increasing pressure on workers who fear the judgment of the algorithm, and that they could encourage people not to take breaks or spend time in creative thought that will not be logged.

“If performance targets are being fine-tuned by AI and your progress towards them being measured by AI, that will only multiply the pressure,” said Ursula Huws, a professor of labour and globalisation at the University of Hertfordshire. “People are deemed not to be working if they take their hands off the keyboard for five minutes. But they could be thinking, and that doesn’t get measured. What is this doing for innovation, which needs creative workers?”

She said there were risks to mental health if people did not feel free to take breaks, for example to surf social media for a few minutes or play a game. A survey released this week suggests UK workers procrastinate for an average of three hours a day.

The Isaak system has already gathered data on more than 1bn actions, which it uses to pinpoint “central individuals within a network” to better allocate workload and responsibilities, “ultimately improving the overall workplace environment and reducing stress and overworking”.

It is part of what experts have labelled the “precision economy”, in which more and more aspects of life will be measured. The Royal Society of Arts predicts that in the next 15 years, life insurance premiums will be set with data from wearable monitors and workers in retail and hospitality will be tracked for time spent inactive. As gig economy working spreads, people will qualify for the best jobs only with performance and empathy metrics that pass a high threshold. Those with lower scores will have access to only the most menial and sometimes miserable tasks such as content moderation on social media, the RSA has predicted.

Ankur Modi, the chief executive of Status Today, said his system aimed to provide a “wellbeing analysis” and could detect overwork – for example at evenings and weekends. But he admitted: “there’s always a risk that it might be misused”. He agreed it was a legitimate concern that companies could use it only to boost productivity, without focusing on wellbeing.

“If one salesperson is performing well and you can see overwork and another isn’t performing well and isn’t overworked, that could be enough to start a conversation,” he said.

However, he argued that it could help bosses cut out bias and discrimination by removing subjectivity from management decisions.

AI ideas that are being developed elsewhere have included the use of facial recognition software and mood monitoring at work, recording a worker’s location on wearable devices and the monitoring of keyboard strokes. A survey by the Trades Union Congress last year found that a majority of workers were opposed to all of these.

The TUC’s general secretary, Frances O’Grady, said: “Workers want to be trusted to do their jobs. But this kind of high-tech snooping creates fear and distrust. And by undermining morale, it could do businesses more harm than good. Employers should only introduce surveillance technologies after negotiation and agreement with the workforce, including union representatives. There should always be a workplace agreement in place that clarifies where the line is drawn for legitimate use, and that protects the privacy of working people.” (source, source)
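The quoted claim that Isaak pinpoints “central individuals within a network” from email and file metadata suggests something like graph centrality analysis. As a rough, purely hypothetical illustration (Status Today has not published Isaak’s internals), here is a Python sketch of ranking staff by how many distinct colleagues they exchange email with; the names and log format are invented:

```python
# Hypothetical sketch only: this is NOT the Isaak algorithm, whose internals
# are not public. It illustrates the general idea of finding "central
# individuals" in a workplace communication graph.
from collections import defaultdict

# Assumed input format: (sender, recipient) pairs harvested from email metadata.
email_log = [
    ("alice", "bob"),
    ("alice", "carol"),
    ("bob", "alice"),
    ("carol", "dave"),
    ("dave", "alice"),
]

def degree_centrality(log):
    """Count how many distinct colleagues each person exchanges email with."""
    contacts = defaultdict(set)
    for sender, recipient in log:
        contacts[sender].add(recipient)
        contacts[recipient].add(sender)
    return {person: len(peers) for person, peers in contacts.items()}

# Rank staff from most to least "connected".
ranking = sorted(degree_centrality(email_log).items(),
                 key=lambda item: item[1], reverse=True)
print(ranking)  # [('alice', 3), ('carol', 2), ('dave', 2), ('bob', 1)]
```

Even a toy metric like this shows how easily communication metadata can be collapsed into a single ranking of people, which is precisely what worries the system's critics.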

This trend connects to the rise of the smartphone and, with it, the “social credit” system being employed in China.

The “social credit” system in China works on the principle of tracking all of a person’s purchases, “actions,” and “locations.” It can presumably track more, such as driving speed and the decisions a man makes, since phones are ostensibly able to monitor a person in “real time.” The result is the creation of a “profile” of a man in which every action he takes is judged without regard to circumstance, thought process (which varies from person to person), or decisions of a personal nature. These actions are then weighted and judged by a mathematical formula in a computer to generate a “result” that places the person into a “good” or “bad” category.
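To make the “mathematical formula” description concrete, here is a purely hypothetical Python sketch of how such a weighted scoring scheme could work in principle. The features, weights, and threshold are invented for illustration and do not reflect any real system’s formula:

```python
# Purely illustrative sketch of a weighted scoring scheme of the kind described
# above. The features, weights, and threshold are invented; they do not reflect
# any real system's formula.
WEIGHTS = {
    "on_time_payments": 2.0,
    "approved_purchases": 1.0,
    "flagged_locations": -3.0,
    "traffic_violations": -1.5,
}

def score(profile):
    """Collapse a person's tracked behaviour into a single number."""
    return sum(WEIGHTS[feature] * value for feature, value in profile.items())

def classify(profile, threshold=0.0):
    """Sort a person into a 'good' or 'bad' category based on the score."""
    return "good" if score(profile) >= threshold else "bad"

person = {"on_time_payments": 12, "approved_purchases": 30,
          "flagged_locations": 2, "traffic_violations": 1}
print(score(person), classify(person))  # 46.5 good
```

The point of the sketch is how mechanical the judgment is: whatever nuance lies behind each tracked behaviour is flattened into a weight, a sum, and a threshold.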

This creates not a situation of “Deus ex machina,” but rather a “Deus in machina,” in which a man turns a computer into a third-party judge of the worth of another man according to its “formula.” Under the system, those with high scores receive many social benefits, while those with low scores are denied even basic things such as housing and transportation. Millions of people already hold “low” scores, and it is said that once one receives a “low” score it can be almost impossible to escape. It is a science fiction or horror novel brought to real life in a way more sinister than even the most creative writer could imagine.

This particular system, as used in China, will not take the same form in the Western world, because it does not suit the cultural idioms necessary for it to be accepted there. However, it can be adapted with a little assistance, such as through this AI program that “monitors” employees at all times.

The way to adapt such a program is to make the results an “indirect” consequence that leads to a “direct” action. It is the same passive-aggressive approach that employers often use when firing people they simply do not like. In the case above, the artificial intelligence provides a “metric” of “efficiency” (or another term with the same basic meaning) that imparts a “value” to an employee. This “value” is then set alongside those of other employees and is used to “rate” who is the “best” worker. Those with the highest rankings are rewarded, while those with the lowest may be punished or fired. Given the increasing automation of the workplace, and the fact that employers have a broad pool of workers to choose from, being fired from one’s job is approaching a death sentence for many, because it can destroy a man’s long-term ability to provide economically for his household. In this way, the system has the same effect as the Chinese one, but uses an indirect series of means to reach the same nefarious conclusions.
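As a purely hypothetical illustration of this indirect mechanism, the following Python sketch ranks employees by an opaque “efficiency” score and flags the bottom of the ranking for “review.” The names, scores, and cutoff are invented and stand in for whatever metric such a system might actually produce:

```python
# Hypothetical sketch of the indirect mechanism described above: an opaque
# "efficiency" metric is used to rank employees, and the bottom of the ranking
# is flagged for "review". Names, scores, and cutoff are invented.
employees = {
    "alice": 0.91,   # AI-assigned "efficiency" score (assumed 0-1 scale)
    "bob":   0.74,
    "carol": 0.55,
    "dave":  0.48,
}

def rank_and_flag(scores, bottom_fraction=0.25):
    """Rank staff by their metric and flag the lowest-scoring fraction."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    cutoff = max(1, int(len(ranked) * bottom_fraction))
    flagged = ranked[-cutoff:]
    return ranked, flagged

ranked, flagged = rank_and_flag(employees)
print(ranked)   # ['alice', 'bob', 'carol', 'dave']
print(flagged)  # ['dave'] -- the "lowest" worker, at risk of punishment or firing
```

Nothing in such a pipeline ever issues a direct verdict on a person; it merely produces a ranking, and the consequences follow “indirectly” from where one happens to fall in it.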

One should be aware of direct “social credit” type systems in the US, but in doing so one should look first at the various ideas that would lead to their implementation: ideas that attempt to measure, scrutinize, and judge even the smallest of actions, or to record personal data in a computer of any sort. These systems will be introduced in ways that make the tasks of life more convenient or that offer small financial incentives for accepting them. However, the small benefits offered come at the far greater cost of surrendering personal data, which is more valuable and which is used to create a “profile” by which a man can be classified, categorized, and judged on his “worth” by a computer system that then determines his place in society.

