• GenderNeutralBro@lemmy.sdf.org · 9 months ago

    Here’s a wild idea: make them publish the exact criteria and formulae used to determine coverage. Their decisions should be verifiable and reproducible.

    This isn’t rocket science.

  • CaptainPedantic@lemmy.world · 9 months ago

    Who needs “AI” when the simple algorithm they already use works perfectly well?

    while True:
        deny_coverage = True
    
    • bartolomeo@suppo.fi · 9 months ago

      I hate that you are absolutely right.

      Medical directors do not see any patient records or put their medical judgment to use, said former company employees familiar with the system. Instead, a computer does the work. A Cigna algorithm flags mismatches between diagnoses and what the company considers acceptable tests and procedures for those ailments. Company doctors then sign off on the denials in batches, according to interviews with former employees who spoke on condition of anonymity.

      “We literally click and submit,” one former Cigna doctor said. “It takes all of 10 seconds to do 50 at a time.”
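The flow that article describes (flag diagnosis/procedure mismatches against an approved list, then deny in batches of 50) could be sketched roughly like this. Everything here is invented for illustration: the `ACCEPTABLE` mapping, the claim fields, and the function names are hypothetical, not Cigna's actual system.

```python
# Hypothetical sketch of the batch-denial flow described above.
# The ACCEPTABLE table and claim fields are made up for illustration.
ACCEPTABLE = {
    "sinusitis": {"nasal endoscopy"},
    "back pain": {"physical therapy"},
}

def flag_mismatches(claims):
    """Flag any claim whose procedure isn't on the insurer's
    approved list for that diagnosis."""
    return [c for c in claims
            if c["procedure"] not in ACCEPTABLE.get(c["diagnosis"], set())]

def batch_sign_off(flagged, batch_size=50):
    """'Click and submit': mark every flagged claim denied,
    50 at a time, no chart review involved."""
    for i in range(0, len(flagged), batch_size):
        for claim in flagged[i:i + batch_size]:
            claim["status"] = "denied"
    return flagged
```

Note there's no step anywhere in that loop where a patient record gets opened, which is exactly the point of the quote.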

  • Tolstoshev@lemmy.world · 9 months ago

    AI will deny the care, then it gets rubber-stamped by a doctor who graduated last in his class and, since this is the only job he can get, plays traitor for the insurance companies.

  • just_change_it@lemmy.world · 9 months ago

    Yeah, sure, ok. We pinky promise not to use AI to generate leads that are then printed out on paper and put in front of a doctor’s assistant’s autopen for signatures denying insurance or coverage.

    There is absolutely ZERO way to enforce this in practice. An AI team can act as a black box, ingesting data and outputting hard copies that cannot be traced back to it. There is no way this won’t happen.

    “We’ll audit the company!” -> they’ll send the data to an offshore shell company that doesn’t follow the law, then the recommendations will be sent back.

    Prove that legislation can stop this, just try.

  • pineapplepizza@lemm.ee · 9 months ago

    I’m not from the US, but it baffles me that someone can be cut off from health care in a supposedly first-world country.