What Attitudes Does the Public Have About Algorithms…

Algorithms are how we turn huge amounts of data, and the analytics built on it, into smarter decisions. They are responsible for the daily recommendations we get about recipes we might like, books we might find interesting, and movies we'd probably enjoy. Algorithms can also take on far more high-stakes tasks, such as estimating the likelihood that a tumor is cancerous or predicting whether someone has criminal tendencies. Recently, the Pew Research Center conducted a study to gauge US adults' attitudes about algorithms. By and large, most of those surveyed were fairly skeptical about such tools when applied to "real life."

A majority, 58%, felt that human bias would always be inherent in an algorithm, while 40% thought such programs could eventually be designed in a bias-free way. Other concerns beyond bias included algorithms violating privacy, failing to understand nuance, and evaluating people under unfair conditions.

The researchers presented participants with four scenarios in which machines had to gather and analyze large quantities of data. The scenarios were based on real-life situations: assessing a personal finance score to offer deals to a consumer; assessing criminal risk for parole decisions; screening resumes; and having a computer analyze job interview results. So what did the study uncover about attitudes toward algorithms…

The Public's Concern with Fairness

A substantial number of respondents thought the algorithms were unfair to the people being evaluated. Only about one-third felt the tools were fair to the job interviewee or to the person whose finance score was being assessed, and almost 70% said that using algorithms in these scenarios was unacceptable.

The reasons for deeming the algorithms unacceptable tended to involve the lack of "humanity" associated with the systems. The following were cited:

  • Privacy violation. This concern mainly applied to the personal finance score example; 26% said that using the algorithm was unacceptable for this reason.

  • Innately unfair. Many worried about the fairness aspect of the system, particularly when it came to the job interview situation and the screening of applicants' resumes.

  • Decisions are no longer "human" based. Almost 40% of participants said that the removal of the human factor was worrisome, especially in the resume screening scenario.

  • Failure to capture nuance. People can weigh all the elements involved in a decision, including the smaller, more nuanced issues. An algorithm, many argued, lacks this ability to identify nuance, and a machine does not have the capacity for personal growth that would let it develop that ability.

Attitudes Often Depend on the Context

While skepticism about algorithms was fairly consistent across the board, context also played a role in people's attitudes. The type of decision being made, along with the traits of the people affected, often influenced participants' answers.

Attitudes differed somewhat for criminal risk and personal finance, for instance. Over fifty percent felt it would be acceptable to use an algorithm to determine whether someone should be offered a discount or deal, and slightly under fifty percent thought an algorithm would work well in determining whether or not someone should get parole.

As for the use of algorithms across social media platforms, the results were somewhat split. For sites recommending events, movies, books, and the like, 75% said they'd be fine sharing personal information to this end. However, 37% would not share such information if it was going to be used to generate political messaging.

Age was also a factor in whether respondents would share their information. Those under fifty had less of a problem sharing it for recommendation purposes, while those 65 and older were more hesitant to share their data.

Social Media: Positive and Negative Content

Social media has been fundamentally changed by algorithms. The content we see, the ads we're shown, the recommendations we get, and even, to an extent, how we scroll are all shaped by them. That said, social media users are not necessarily pleased with what they are presented with. Over seventy percent said the content they see angers them, and sixty percent said the posts they get tend to be exaggerated. Still, 44% of respondents did say that the content posted was on the amusing side.

The study, conducted between May and June of 2018, also uncovered a few other interesting attitudes about how people respond to algorithms.

  • Race, religion and ethnicity impact perspective. 25% of whites deemed the personal finance algorithms fair, while 45% of blacks said such algorithms are fair. Over sixty percent of blacks said the parole algorithm would not be fair to the person being assessed; just under fifty percent of whites agreed.

  • Approximately 75% of those asked said that social media does not reflect public perception in general; only 25% said that social media posts reflect societal views and attitudes.

  • Age is a factor as well: younger adults are more often amused by social media posts than older adults, and they are less prone to feel angry because of a post (only 27% said so).

Many businesses, such as Amazon and Target, use algorithms, and they have found them quite useful. If you want to incorporate algorithms into your business but need capital to do so, this is where First Union Lending comes in. We would love to discuss options!
