SoylentNews is people

posted by Fnord666 on Monday January 11 2021, @08:27PM
from the lost-in-the-computer dept.

New York City proposes regulating algorithms used in hiring:

In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video interviews.

That worries some tech experts and civil rights groups, who cite evidence that algorithms can replicate or magnify biases shown by people. In 2018, Reuters reported that Amazon scrapped a tool that filtered résumés based on past hiring patterns because it discriminated against women.

Legislation proposed in the New York City Council seeks to update hiring discrimination rules for the age of algorithms. The bill would require companies to disclose to candidates when they have been assessed with the help of software.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by DannyB on Monday January 11 2021, @09:02PM (6 children)

    by DannyB (5839) Subscriber Badge on Monday January 11 2021, @09:02PM (#1098566) Journal

    Wouldn't it be better to simply stop hiring and let robots do all the work?

    This would eliminate any claims of discrimination against people with certain colors, height, weight, gender, and dangerous ideas about certain pizza toppings.

    All humans would be rejected equally, no discrimination.

    --
    The lower I set my standards the more accomplishments I have.
    • (Score: 1, Touché) by Anonymous Coward on Monday January 11 2021, @10:30PM (5 children)

      by Anonymous Coward on Monday January 11 2021, @10:30PM (#1098623)

      I think @Bot will have an issue that you're implying they all look alike - bot discrimination is real okay

      • (Score: 2, Touché) by Ethanol-fueled on Tuesday January 12 2021, @12:44AM (4 children)

        by Ethanol-fueled (2792) on Tuesday January 12 2021, @12:44AM (#1098692) Homepage

Getting more underprivileged minorities into college to prepare them for the workplace would have been a good thing 20 or 30 years ago. Now that critical thinking has gone out the window and American education is merely rote memorization of ChiCom/Bolshevik nonsense, minorities would only be stuck in the same mental prisons -- except now with a shitload of debt to show for it. By design, of course -- no modern corporate plantation master wants his labor getting uppity about "equal treatment" and "living wages" when those minorities see that they were hired as tokens while the top is still colored White.

        • (Score: 2) by Reziac on Tuesday January 12 2021, @02:59AM

          by Reziac (2489) on Tuesday January 12 2021, @02:59AM (#1098732) Homepage

          The shitload of debt may be the real point.

          --
          And there is no Alkibiades to come back and save us from ourselves.
        • (Score: 5, Touché) by Kell on Tuesday January 12 2021, @03:04AM (2 children)

          by Kell (292) on Tuesday January 12 2021, @03:04AM (#1098734)

As an engineering professor myself, I'll have you know that communist indoctrination is currently only a small part of our core curriculum - I mean, between fundamentals of dynamics, programming and circuit theory, we simply don't have time in the semester to properly cover all the essentials as is! With recent cutbacks, we've had to make a difficult choice between Signals and Systems and Introduction to Bolshevism; of course, we made the correct decision and chucked Signals - the students hated it anyway. Long-term, I'm in talks with the faculty to try to squeeze Materials Science and Statics into a single module to give some urgently needed space for Intersectional Proletariat Studies and Maoist Feminism II. Even then, we'll still be hard pressed to find room for Conformal Thought (honours) practicals. If we're successful, I believe we can streamline our processes until most students will only have to encounter technical subjects as electives. /s

          --
          Scientists ask questions. Engineers solve problems.
          • (Score: 1, Interesting) by Ethanol-fueled on Tuesday January 12 2021, @04:34AM (1 child)

            by Ethanol-fueled (2792) on Tuesday January 12 2021, @04:34AM (#1098754) Homepage

Flimsy pilpul. Students here must be subjected to 2 years of the kind of bullshit that you are trying to dismiss as hyperbole before they can graduate. In my case, social justice bullshit was directly integrated into some of my computer science classes.

            Of course, G.E.'s weren't always bullshit and they were actually enriching, but I'm with you on dropping them entirely in favor of classes required for the major, at least until American education unfucks itself with regard to woke bullshit -- though I'm sure that all those extra required bullshit diversity classes are more money for the federally-subsidized racket.

            • (Score: 2) by Kell on Tuesday January 12 2021, @06:35AM

              by Kell (292) on Tuesday January 12 2021, @06:35AM (#1098774)

Banter aside, the mainstream Australian system has practically zero non-engineering-related content - famously, our tertiary system is geared almost entirely towards professional training. Our biggest problem is that we really are pressed for curriculum space, given that we need to do more and more to buttress the maths skills of students who increasingly haven't learned the necessary mathematics in secondary school. On top of that, there is increasing demand from employers for our students to receive education in report writing, communication, and professional topics (like law and the basics of economics), which - though valuable - are not technical topics. The only part of our curriculum one could consider 'woke' is the component on sustainable engineering practice and global grand challenges, which at least is connected to the bigger picture of engineering practice. In total, I feel like our degree program is actually pretty sensible. The American liberal arts college education... well... that's a whole other story.

              --
              Scientists ask questions. Engineers solve problems.
  • (Score: 4, Interesting) by Thexalon on Monday January 11 2021, @09:08PM (11 children)

    by Thexalon (636) on Monday January 11 2021, @09:08PM (#1098572)

What are increasingly being called "algorithms" in the lay press (i.e. the results of machine learning) are only as good as the data you feed them and the rules you tell them to follow, and that means you can make an algorithm do pretty much whatever you want under the old GIGO principle.

As for how these algorithm developers should handle race, gender, orientation, marital status, etc., the first step to making your machine not use that data is to not let your machine learning algorithm see that data in the first place.

The one place where I see this getting complicated is education: if your algorithm looks at the university or high school district and starts treating, say, a Communications grad from Howard as radically different from a Communications grad from NYU, there may be some question about whether there are real differences in the expected quality of those graduates, or whether this is cementing past racial discrimination into code, because the Howard grad is much more likely to be black.
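That proxy problem can be sketched in a few lines. The following is a purely hypothetical simulation (made-up groups, schools, and rates - not real data): the protected attribute is withheld from the "model" exactly as suggested above, yet because a visible feature (school) correlates with it, a model fit on biased historical decisions reproduces the bias anyway.

```python
import random

random.seed(0)

# Toy synthetic data. "group" is the protected attribute; the model never
# sees it. "school" correlates with group, and the historical hiring
# decisions were biased in favor of group A.
def make_candidate():
    group = random.choice(["A", "B"])
    typical = random.random() < 0.9  # 90% attend their group's typical school
    school = "X" if (group == "A") == typical else "Y"
    hired = random.random() < (0.7 if group == "A" else 0.3)  # biased history
    return group, school, hired

history = [make_candidate() for _ in range(10_000)]

# "Train" the simplest possible model: observed hire rate per school.
# The protected attribute is excluded from the features entirely.
rate = {}
for s in ("X", "Y"):
    hires = [hired for group, school, hired in history if school == s]
    rate[s] = sum(hires) / len(hires)

# Score candidates by school alone, then average the scores per group:
# the historical bias resurfaces through the correlated proxy.
for g in ("A", "B"):
    scores = [rate[school] for group, school, hired in history if group == g]
    print(g, round(sum(scores) / len(scores), 2))
```

Running this prints a noticeably higher average score for group A than for group B, even though group membership was never an input - which is why simply dropping the protected column is not enough when the training labels themselves encode past decisions.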

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 3, Touché) by crafoo on Monday January 11 2021, @09:41PM (4 children)

      by crafoo (6639) on Monday January 11 2021, @09:41PM (#1098593)

      Ah yes. The "reality is racist and sexist" argument. Cripple the neural net because it does not produce the results you believe are Just and Right.

      • (Score: 2) by Thexalon on Monday January 11 2021, @10:01PM (2 children)

        by Thexalon (636) on Monday January 11 2021, @10:01PM (#1098604)

        So let me get this straight, and to simplify I'll reduce this down to 2 candidates for the same position asking similar salaries / benefits:
        - Candidate A has 10 years of experience in a total of 2 jobs, a BA from a good accredited university in an appropriate field, and strong recommendations from their former bosses.
        - Candidate B has 3 years of experience in 9 jobs, an Associate's degree from a community college in an appropriate field, and no recommendations.
        Your algorithm chooses, for fairly obvious reasons, Candidate A.

        What you want to do is feed it some additional information:
        - Candidate A is a homosexual black woman.
        - Candidate B is a straight white man.
        And now you want the algorithm to choose Candidate B, even though Candidate B is worse by every measure other than your own racial and gender and orientation biases.

        Yeah, that's exactly the thinking that these regulations are talking about, because opting for Candidate B because of his race and gender and orientation is considered illegal employment discrimination under federal and state laws.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
        • (Score: 2) by DannyB on Monday January 11 2021, @10:25PM

          by DannyB (5839) Subscriber Badge on Monday January 11 2021, @10:25PM (#1098619) Journal

The Right™ thing to do would be for the algorithm to send Candidate A a letter:

          Dear Candidate A,

          While your qualifications, education and recommendations are outstanding, we cannot accept you for employment at this time. Your skin color, gender, and orientation are just too diverse. Please change them and then re-apply. We look forward to reconsidering your revised application.

          Thank You

          --
          The lower I set my standards the more accomplishments I have.
        • (Score: 1) by khallow on Tuesday January 12 2021, @12:25PM

          by khallow (3766) Subscriber Badge on Tuesday January 12 2021, @12:25PM (#1098822) Journal

          So let me get this straight

          [...]

          And now you want the algorithm to choose Candidate B, even though Candidate B is worse by every measure other than your own racial and gender and orientation biases.

          Yeah, that's exactly the thinking that these regulations are talking about, because opting for Candidate B because of his race and gender and orientation is considered illegal employment discrimination under federal and state laws.

No, I believe crafoo was speaking of the usual quota outlook on such things: 1 out of 10 is a particular minority, so your algorithm should be selecting 1 out of 10 or better, even if the number of available, qualified minorities for that particular position is much lower, like 1 out of 20. Presently, by using an algorithm instead of a human to make decisions, employers are protected.

      • (Score: 2) by c0lo on Tuesday January 12 2021, @12:42AM

        by c0lo (156) Subscriber Badge on Tuesday January 12 2021, @12:42AM (#1098690) Journal

        Ah yes. The "reality is racist and sexist" argument.

Let's feed it the data from ancient Athens; they were really the first democracy [wikipedia.org] (for some values of demos - women and slaves need not apply).

Or let's feed it 1950-1970; that's a more recent reality [youtube.com].

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 3, Insightful) by Immerman on Monday January 11 2021, @09:43PM (4 children)

      by Immerman (3985) on Monday January 11 2021, @09:43PM (#1098596)

      Indeed.

      It seems to me we shouldn't even need more laws, at most we just need to make absolutely clear that you can't hide behind "the algorithm did it". The algorithm did nothing - the algorithm is just a tool *you* used to do something. At the end of the day it's still a human making the decision - what tools they use are irrelevant to whether they're being discriminatory.

      The alternative is ridiculous: "I didn't murder anyone, the robot did it. (While following the instructions I gave it on how to kill them.)"

      • (Score: 1, Insightful) by Anonymous Coward on Tuesday January 12 2021, @04:35AM (3 children)

        by Anonymous Coward on Tuesday January 12 2021, @04:35AM (#1098755)

        The problem is that if you don't tell the algorithm the age, sex, race, political alignment, and sexual preferences of the candidates then it will pick them solely on merit.

        • (Score: 2) by Immerman on Tuesday January 12 2021, @02:47PM

          by Immerman (3985) on Tuesday January 12 2021, @02:47PM (#1098859)

Not quite. It will pick them solely on the criteria it was trained/programmed around. And if that was historical hiring decisions (which it almost certainly was), then it's likely to reflect historical hiring biases, based on included information that correlates with the excluded information.

        • (Score: 0) by Anonymous Coward on Tuesday January 12 2021, @03:44PM (1 child)

          by Anonymous Coward on Tuesday January 12 2021, @03:44PM (#1098891)

Why is this a problem? The real problem is the lack of good education for the groups you mentioned. When you pick the best for teaching, you get the best students. Fix the source code, not the program output. In the meantime, send people back to school. Get them caught up with the curve. Maybe the companies can select people who are close to qualifying and pay for more education so they can hire them in the next round.

          Song for thought:

          Everywhere is freaks and hairies
          Dykes and fairies, tell me, where is sanity?
          Tax the rich, feed the poor
          'Til there are no rich no more

          I'd love to change the world
          But I don't know what to do
          So I'll leave it up to you

          Population keeps on breeding
          Nation bleeding, still more feeding, economy
          Life is funny, skies are sunny
          Bees make honey, who needs money? No, not poor me

          I'd love to change the world
          But I don't know what to do
          So I'll leave it up to you

For more info see: Ten Years After, circa 1971.

          • (Score: 0) by Anonymous Coward on Wednesday January 13 2021, @05:28AM

            by Anonymous Coward on Wednesday January 13 2021, @05:28AM (#1099323)

            whoosh

    • (Score: 1, Interesting) by Anonymous Coward on Tuesday January 12 2021, @01:29AM

      by Anonymous Coward on Tuesday January 12 2021, @01:29AM (#1098708)

      Amazon tried your idea:

      the first step to making your machine not use that data is to not allow your machine learning algorithm see that data in the first place

      It generated "bias" against women. No, actually it revealed that humans were unfairly showing bias in favor of women. Humans would overlook flaws when the candidate was a woman, but the computer didn't care.

      This has also been proven to be the case in hiring studies for academic faculty. If you want to be a professor, you'll find it easier if you are a woman.

      Men might be able to win a discrimination lawsuit over this stuff, but of course it is embarrassing and unmanly to be the plaintiff in a case like that, and the companies know it.

  • (Score: -1, Troll) by Anonymous Coward on Monday January 11 2021, @09:30PM (5 children)

    by Anonymous Coward on Monday January 11 2021, @09:30PM (#1098587)

    Whites, esp. white men, esp. white men who mate with women?
    Is it OK to discriminate against them? Of course, it's not only done, it's official, stated policy.

    • (Score: 1, Touché) by Anonymous Coward on Monday January 11 2021, @09:40PM (1 child)

      by Anonymous Coward on Monday January 11 2021, @09:40PM (#1098592)

      White men are not discriminated against. All other colors, and all women, are given preference.

      • (Score: 0) by Ethanol-fueled on Tuesday January 12 2021, @12:37AM

        by Ethanol-fueled (2792) on Tuesday January 12 2021, @12:37AM (#1098687) Homepage

        They're already throwing White women under the bus. They want all-minority staffing, let them have it. Let them hire more Latinos and Black people, still high and empowered from the pedestal on which they were placed, into engineering/executive/boardroom roles.

        Let us play by their rules. The question of whether or not Jewish people are "White" is probably the single biggest fracture-point that could be exploited when encouraging a truly diverse workplace that now has to practice what it preaches with regard to real diversity, not just their "diversity of one as long as it ain't White" definition that Google uses with Chinese, or other firms use with Indians. Personally, I'd enjoy seeing Indians and Chinese being elbowed aside by domestic Latinos and Blacks who at least have some kind of tie to America, spending all their money here, and some interest in maintaining standards of living and the American dream of upward mobility.

    • (Score: 2, Funny) by DannyB on Monday January 11 2021, @10:27PM (1 child)

      by DannyB (5839) Subscriber Badge on Monday January 11 2021, @10:27PM (#1098621) Journal

      Whites, esp. white men, esp. white men who mate with women?

      Wait.

      I'm confused.

      I thought they were called Incels?

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 0) by Anonymous Coward on Monday January 11 2021, @10:35PM

        by Anonymous Coward on Monday January 11 2021, @10:35PM (#1098626)

        Boring troll is boring.
        ZZZzZzz ...

    • (Score: 2) by c0lo on Tuesday January 12 2021, @12:45AM

      by c0lo (156) Subscriber Badge on Tuesday January 12 2021, @12:45AM (#1098693) Journal

      I see you left incels outside your considerations. How not very inclusive of you. (grin)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 2) by krishnoid on Monday January 11 2021, @10:38PM

    by krishnoid (1156) on Monday January 11 2021, @10:38PM (#1098629)

    About the night watchman [jokes4us.com] and how government works. When AI is fed past history and is asked to use that to make future predictions, it does so faithfully. When that then reveals human decision patterns/biases, we humans do the most sensible thing and say the responsibility rests with the algorithm or the AI. I bet that's because it can't defend itself, and scape(cyber-)goating is very much a human trait.

  • (Score: 1, Interesting) by Anonymous Coward on Monday January 11 2021, @10:42PM (2 children)

    by Anonymous Coward on Monday January 11 2021, @10:42PM (#1098633)

    Govt doesn't have to be competent. They never go out of business. The money (taxes) rolls in either way. They don't care about the private sector's need for competent people, even GOOD people, to succeed or survive as a business concern. The businesses most eager and able to comply with this diversity lottery are the large businesses that can afford to hire some non- or marginally-productive employees as the cost of keeping the govt happy. The other employees will carry the deadweights' burden.

    • (Score: 0) by Anonymous Coward on Tuesday January 12 2021, @12:51AM

      by Anonymous Coward on Tuesday January 12 2021, @12:51AM (#1098696)

      Govt doesn't have to be competent.

      Neither do the big corps. AT&T, rings any Bell?

    • (Score: 2) by helel on Tuesday January 12 2021, @07:07PM

      by helel (2949) on Tuesday January 12 2021, @07:07PM (#1098999)

      I'm pretty sure everyone who complains about non-productive government employees has never worked in the private sector. Having worked both, my experience is there's just as much deadweight, it's just in the corner office.

  • (Score: 3, Funny) by SomeGuy on Monday January 11 2021, @11:15PM (1 child)

    by SomeGuy (5632) on Monday January 11 2021, @11:15PM (#1098647)

    Suspiciously, their fancy new AI algorithm only hires people who are golden brown.

  • (Score: 4, Interesting) by rigrig on Tuesday January 12 2021, @08:16AM

    by rigrig (5129) <soylentnews@tubul.net> on Tuesday January 12 2021, @08:16AM (#1098794) Homepage

    Basically the result would be
    a) you have to tell candidates they didn't get the job because "computer says no" instead of "manager felt you wouldn't be a good fit"
    b) software vendors will be "auditing" their own software for bias

    If you really want to work against bias, how about this:
Every rejection letter needs to include the (anonymized) application of the person who did get the job, and a proper explanation of why they were better qualified.
"The algorithm assigned them 90 points and you only 60" is not sufficient, but "their 10 years of experience outweighs your slightly higher formal education" is.
    (Very unlikely to happen, as software vendors don't want to disclose their proprietary magic hiring formulas (assuming they even understand how the black box works))

    --
    No one remembers the singer.
  • (Score: 2, Informative) by Anonymous Coward on Tuesday January 12 2021, @11:03AM (1 child)

    by Anonymous Coward on Tuesday January 12 2021, @11:03AM (#1098814)

This was tried in Australia. It worked. The experiment resulted in more white males being selected for interview, because of the human bias in favor of female and non-white applicants. The end result was they shut it down and went back to the old bias-based system of resume selection for interviews. So much for justice being blind. We can't hire the best candidates because they are white males.

    • (Score: 0) by Anonymous Coward on Tuesday January 12 2021, @01:30PM

      by Anonymous Coward on Tuesday January 12 2021, @01:30PM (#1098840)

      It's like European Elections, when you don't get the result you want, make them do it again.
