Human Resources Predicts Future
In the Steven Spielberg movie Minority Report, police in a special Pre-crime unit arrest people for crimes they have yet to commit. It’s science fiction, and it will probably never happen in our lifetimes.
However, the pre-crime concept is coming very soon to the world of Human Resources (HR) and employee management.
A Santa Barbara, Calif., startup called Social Intelligence data-mines the social networks to help companies decide if they really want to hire you.
While background checks, which mainly look for a criminal record, and even credit checks have become more common, Social Intelligence is the first company I’m aware of that systematically trawls social networks for evidence of bad character.
Using automation software that slogs through Facebook, Twitter, Flickr, YouTube, LinkedIn, blogs, and “thousands of other sources,” the company develops a report on the “real you” — not the carefully crafted you in your resume. The service is called Social Intelligence Hiring. The company promises a 48-hour turn-around.
Because it’s illegal to consider race, religion, age, sexual orientation and other factors, the company doesn’t include that information in its reports. Humans review the reports to eliminate false positives. And the company uses only publicly shared data; it doesn’t “friend” targets to get private posts, for example.
The reports feature a visual snapshot of what kind of person you are, evaluating you in categories like “Poor Judgment,” “Gangs,” “Drugs and Drug Lingo” and “Demonstrating Potentially Violent Behavior.” The company mines for rich nuggets of raw sewage in the form of racy photos, unguarded commentary about drugs and alcohol and much more.
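The company's actual pipeline is proprietary, but the kind of keyword-category flagging such a report implies can be sketched in a few lines. The category keyword lists and the sample posts below are entirely hypothetical illustrations, not Social Intelligence's real criteria:

```python
# Toy sketch of keyword-based category flagging, loosely modeled on the
# report categories described above. Category lists are hypothetical;
# a real system would use far richer models plus human review.

CATEGORIES = {
    "Poor Judgment": {"fake id", "hate my boss"},
    "Drugs and Drug Lingo": {"weed", "stoned", "blunt"},
    "Potentially Violent Behavior": {"fight", "weapon", "threat"},
}

def flag_posts(posts):
    """Return a report mapping each category to the posts that matched it."""
    report = {name: [] for name in CATEGORIES}
    for post in posts:
        text = post.lower()
        for name, keywords in CATEGORIES.items():
            # A post is flagged for a category if any keyword appears in it.
            if any(kw in text for kw in keywords):
                report[name].append(post)
    return report

posts = [
    "Got totally stoned at the lake this weekend",
    "Great quarter for the sales team!",
    "Anyone want to fight about it?",
]
report = flag_posts(posts)
print(report["Drugs and Drug Lingo"])  # only the first post is flagged
```

Naive substring matching like this is exactly why human reviewers are needed to weed out false positives: "stoned" would also flag a post about a stoned fruit recipe.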
The company also offers a separate Social Intelligence Monitoring service to watch the personal activity of existing employees on an ongoing basis. The service is advertised as a way to enforce company social media policies, but because the monitoring criteria are company-defined, it’s not clear that the watching can be limited to work-related activity.
The service provides real-time notification alerts, so presumably the moment your old college buddy tags an old photo of you naked, drunk and armed on Facebook, the boss gets a text message with a link.
Two aspects of this are worth noting. First, company spokespeople emphasize liability. What happens if one of your employees freaks out, comes to work and starts threatening coworkers with a samurai sword? You’ll be held responsible because all of the signs of such behavior were clear for all to see on public Facebook pages. That’s why you should scan every prospective hire and run continued scans on every existing employee.
In other words, they make the case that now that people use social networks, companies will be expected (by shareholders, etc.) to monitor those services and protect the company from lawsuits, damage to reputation, and other harm. And they’re probably right.
Second, the company provides reporting that deemphasizes specific actions and emphasizes character. It’s less about “what did the employee do” and more about “what kind of person is this employee?”
Because, again, the goal isn’t punishment for past behavior but protection of the company from future behavior.
It’s all about the future.
The Future of Predicting the Future
Predicting future behavior, in fact, is something of a growth industry.
A Cambridge, Mass., company called Recorded Future, which is funded by both Google and the CIA, claims to use its “temporal analytics engine” to predict future events and activities by companies and individual people.
Like Social Intelligence, Recorded Future uses proprietary software to scan all kinds of public web sites, then applies some kind of magic pixie dust to find invisible logical linkages (as opposed to HTML hyperlinks) that point to likely outcomes. Plug in your search criteria, and the results come back as surprisingly accurate predictions of the future.
Recorded Future is only one of many new approaches to predictive analytics expected to emerge over the next year or two. The ability to crunch data to predict future outcomes will be used increasingly to estimate traffic jams, public unrest, and stock performance. But it will also be used to predict the behavior of employees.
Google revealed last year, for example, that it is developing an algorithm that can accurately predict which of its employees are most likely to quit, based on a predictive analysis of data like employee reviews and salary histories. The company simply turns the software loose on personnel records, and the system spits out a list of the people who are probably going to resign soon. (I’m imagining the results laser-etched on colored wooden balls.)
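Google hasn't published how its model works, but the general idea of scoring attrition risk from personnel data can be sketched with a simple logistic score. The feature names, weights, and employees below are invented for illustration only:

```python
import math

# Hypothetical feature weights; Google's actual model is not public.
WEIGHTS = {
    "years_since_raise": 0.9,
    "below_market_salary": 1.2,
    "declining_review_scores": 1.5,
}
BIAS = -3.0

def quit_probability(employee):
    """Logistic score: a probability-like estimate that an employee quits soon."""
    z = BIAS + sum(WEIGHTS[k] * employee.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

staff = {
    "alice": {"years_since_raise": 3, "below_market_salary": 1,
              "declining_review_scores": 1},
    "bob":   {"years_since_raise": 0, "below_market_salary": 0,
              "declining_review_scores": 0},
}

# Rank employees from most to least likely to resign.
at_risk = sorted(staff, key=lambda name: quit_probability(staff[name]),
                 reverse=True)
print(at_risk)  # ['alice', 'bob']
```

In practice the weights would be learned from historical records of who actually resigned, rather than hand-set; the point is only that the output is a ranked list of names, produced without asking anyone anything.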
HR professionals wear many hats, and one of them is crystal ball reader.
All hiring and promotion, and some firing, are based on predictions about the future. They take available data (resumes, interviews, references, background checks, etc.) and advise hiring managers on what kind of asset a person will be in the future. Will they interact well with other employees? Will they be a good manager? Will they keep company secrets? Will they show up on time?
Okay, let’s put this all together. What happens when social networking analysis and predictive analytics are combined for HR goals?
Following the current trend lines, very soon social networking spiders and predictive analytics engines will be working night and day scanning the Internet and using that data to predict what every employee is likely to do in the future. This capability will simply be baked right in to HR software suites.
When the software decides that you’re going to quit, steal company secrets, break the law, post something indecent on a social network or lie on your expense report, the supervising manager will be notified and action will be taken — before you make the predicted transgression.
If you think that’s unlikely, consider two facts. First, look at how fast we got to where we are today. Three years ago you had never heard of Twitter and weren’t a member of Facebook. Today, you could be passed over for a job because of something you, or even someone else, posted on one of those services.
Second, contrast personnel actions with legal actions. When you stand before the law accused of wrongdoing, you get to face your accuser. You can’t legally be thrown in jail for bad character, poor judgment, or expectations of what you might do in the future. You have to actually break the law, and they have to prove it.
Personnel actions aren’t anything like this. You don’t get to “face your accuser.” You can be passed over for hiring or promotion based on what kind of person you are or what they think you might do in the future. You don’t have to actually violate company rules, and they don’t have to prove it.
When it comes to firing you, the company merely has to weigh the risk of a wrongful termination lawsuit against the risk of your predicted future behavior.
If the social-network-scanning, predictive analytics software of the future decides that you are going to do something inconsistent with the company’s interests, you’re fired.
The practice of using available data to predict the future has always been a big part of HR. But the tools are now becoming monstrously sophisticated, efficient, powerful, far-reaching and invasive.
There’s no way around it: The Minority Report Pre-crime concept is coming to HR.