• Korhaka@sopuli.xyz · 7 points · 1 day ago

    Dumping databases is such a small-brain move. That costs what, half a department's overtime for a few days? Nothing.

    Deploying consistently poor-quality code costs far more. That can double the size of the support department and keep them busy for years, at far greater cost to the organisation.

    • notabot@piefed.social · 2 points · 12 hours ago

      Yup, don’t dump the database, just shuffle the primary keys on important tables. Systems keep running, so it takes longer to work out what’s wrong, and the data gets even more screwed up with every passing transaction. You very rapidly end up with a basically unrecoverable mess if you’re also talking to external systems, as none of their data will match yours anymore, even if you do try to recover from an older backup.

      • Korhaka@sopuli.xyz · 1 point · 11 hours ago

        Then fuck with a few stored procedures: swap the data in a few random rows each time one is called. Scatter changes like that around.