• Snot Flickerman@lemmy.blahaj.zone · 8 days ago

    I was trying to figure out why they’re making such a big deal about adding it, since suicides among male veterans still massively outpace those among female veterans, and I stumbled on this note:

    • From 2020 to 2021, the age-adjusted suicide rate increased 6.3% among Veteran men and 24.1% among Veteran women. From 2020 to 2021, the age-adjusted suicide rate increased 4.9% among non-Veteran men and 2.6% among non-Veteran women.

    • In 2021, the age-adjusted suicide rate of Veteran men was 43.4% greater than that of non-Veteran U.S. adult men, and the age-adjusted suicide rate of Veteran women was 166.1% higher than that of non-Veteran U.S. adult women.

    It seems it’s less about the total and much more about the spike. While male veterans still die by suicide more often, that’s in line with what has been happening for decades and isn’t rising at a rate out of step with non-veteran suicide, whereas suicides among female veterans have spiked massively in a short time frame, compared to suicides among non-veteran women, which grew only a fraction as much.
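
    To unpack those percentages, here’s a quick Python sketch using only the figures quoted above (no other data assumed):

    ```python
    # One-year change in the age-adjusted suicide rate (2020 -> 2021), from the note above.
    increase_2020_to_2021 = {
        "veteran men": 6.3,
        "veteran women": 24.1,
        "non-veteran men": 4.9,
        "non-veteran women": 2.6,
    }

    # How much higher the veteran rate was than the non-veteran rate in 2021, from the note above.
    gap_vs_non_veterans_2021 = {"men": 43.4, "women": 166.1}

    for group, pct in increase_2020_to_2021.items():
        print(f"{group:>18}: rate multiplied by {1 + pct / 100:.3f} in a single year")

    for sex, pct in gap_vs_non_veterans_2021.items():
        print(f"veteran {sex}: {1 + pct / 100:.2f}x the non-veteran rate in 2021")

    # "166.1% higher" works out to roughly 2.66x the rate of non-veteran women, on top of
    # a ~24% jump in a single year -- that's the spike, even though the absolute number
    # of deaths is still higher among men.
    ```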

    So, initially confusing, but looking closer, super important to take into account, actually.

    • astropenguin5@lemmy.world · 8 days ago

      Thanks for finding the actual statistics. My initial feeling was that there were too many ways to interpret what the article was saying, and that it was playing silly buggers with the statistics.

    • Notyou@sopuli.xyz · 8 days ago

      That is very interesting to me. It is an insane spike for women. Getting a better understanding of this might lead to better understanding in general.

  • Jackthelad@lemmy.world · 8 days ago

    Headline leaning into the culture wars a bit there, isn’t it?

    I’d fully expect a program like this to favour white men given they make up the majority of the armed forces. It’s called maths. Or “math” I guess, in America.

    • GetOffMyLan@programming.dev · 8 days ago

      It shouldn’t favour anyone. It should treat each person as an individual and figure out what they need based on their characteristics. If it’s been designed to work well only for white men, it’s been designed poorly.

    • sensibilidades@lemmy.world · 8 days ago

      If the algorithm is more likely to help a white servicemember than a black one, that would be a problem, no?

      • Jackthelad@lemmy.world · 8 days ago

        It would be, but is that what’s happening?

        The article seemed to mention men and women, but nothing to do with race apart from the headline.

        • GetOffMyLan@programming.dev · 8 days ago

          From the article: “…after an investigation by The Fuller Project and The Markup found the department’s algorithm prioritized White, male veterans. It also gave preference to veterans who are ‘divorced and male’ and ‘widowed and male’ but not to any group of female veterans.”

          • Nindelofocho@lemmy.world · 8 days ago

            I’m really curious how and why it prioritizes at all. I’m probably ignorant to this, but shouldn’t it just be: if you are a veteran and you need help, you get it? Anything less than that would be inequity?

            • GetOffMyLan@programming.dev · 8 days ago

              I’m guessing they have limited resources for direct intervention so use this to flag up people who have the most risk factors.

              It doesn’t sound like this is people asking for help but more trying to predict who might need it.
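
              Roughly the kind of triage I’m imagining, as a toy sketch - the fields and weights here are entirely made up, not the VA’s actual model:

              ```python
              from dataclasses import dataclass

              @dataclass
              class Veteran:
                  prior_attempt: bool
                  chronic_pain: bool
                  divorced: bool
                  male: bool

              # HYPOTHETICAL weights, purely for illustration. If demographic fields like
              # "male" carry positive weight, the top of the ranked list skews toward that
              # group even when the clinical risk factors are identical.
              WEIGHTS = {"prior_attempt": 3.0, "chronic_pain": 1.5, "divorced": 1.0, "male": 0.5}

              def risk_score(v: Veteran) -> float:
                  return sum(w for field, w in WEIGHTS.items() if getattr(v, field))

              def flag_for_outreach(veterans: list[Veteran], capacity: int) -> list[Veteran]:
                  # With limited outreach capacity, only the highest-scoring cases get flagged.
                  return sorted(veterans, key=risk_score, reverse=True)[:capacity]
              ```

              Two veterans with identical clinical histories would get different scores purely because of the “male” field, which is the kind of preference the article describes.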

    • ContrarianTrail@lemm.ee · 8 days ago

      They also make up the majority of people committing suicide. Not sure about ‘white’, but men, that is.

    • flamingo_pinyata@sopuli.xyz · 8 days ago

      It’s such a harmful aspect of American culture, not just in this case - they treat White and Black people as fundamentally different and separate, and each must have its own kind of healthcare.

  • TheGrandNagus@lemmy.world · 8 days ago

    I mean I guess algorithms are technology, but this really seems like it should be a submission in a US politics community.

    At the very least include the US in the title so we know which government’s policy is being criticised.

    • roofuskit@lemmy.world · 8 days ago

      While I agree that many are too eager to put anything even tangentially related to tech in this community, this belongs. This is a piece of technology that is wildly relevant to this community and to a politics community. Sometimes they are inextricable. More and more these days whether we like it or not.

      • TheGrandNagus@lemmy.world · 8 days ago

        Is a debate about how the US government handles mental health problems amongst veterans really “wildly relevant” to the Technology community on Lemmy?

        Idk, to me it seems like they’re not actually that related at all, beyond them using a database to attempt to figure out who’s more at risk.

        • roofuskit@lemmy.world · 8 days ago

          Sorry Zek, when technology affects people’s lives it’s technology news. Especially when it affects who lives and dies. Making tech treat people equally is only political if someone makes it that way.

          • TheGrandNagus@lemmy.world · 7 days ago

            Of course it’s a political discussion. It’s a discussion of whether the US government should have their system for triaging veteran mental health changed (if what this article states is true, then yeah maybe they should).

            I’m not really sure why something being a matter of politics makes it a bad thing. Why frame something being political as a bad thing?

            Tbh I don’t know how you could frame this as not being a political discussion. The article is about the US government and bills being put to Congress.

            Regardless, it’s only tangentially related to technology, and only relevant to one country. That’s my point.