First, they restricted code search without logging in, so I’m using Sourcegraph. But now I can’t even view discussions or wikis without logging in.

It was a nice run.

  • TootSweet@lemmy.world · 10 months ago

    I moved all my open-source projects to GitLab the day Microsoft announced they were acquiring GitHub.

    (In retrospect, I wish I’d taken the time to research and decide on the right host. I likely would have gone to Codeberg instead of GitLab had I done so. But GitLab is still better than GitHub. And I don’t really know for sure that Codeberg was even around back when Microsoft acquired GitHub.)

  • Scrubbles@poptalk.scrubbles.tech · 10 months ago

    Honestly, for self-hosters, I can’t recommend setting up a Gitea instance enough. You’ll be very happy hosting your code and such there, and you can just mirror it to GitHub or somewhere if you want it on the big platforms.
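The self-host-then-replicate workflow described above can be sketched from the command line. This is a minimal sketch with placeholder URLs and repository names, not real instances; note that Gitea also has a built-in push-mirror feature in the repository settings that automates the same thing.

```shell
# Hypothetical sketch: keep the canonical copy on a self-hosted Gitea,
# push a full mirror to GitHub. All URLs below are placeholders.
git clone https://gitea.example.com/me/myproject.git
cd myproject
git remote add github https://github.com/me/myproject.git
git push --mirror github   # copies all branches and tags to the mirror
```

Run the `git push --mirror` step on a schedule (or use Gitea’s push mirror) and the GitHub copy stays a faithful replica without GitHub ever being the source of truth.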

  • Omega_Haxors@lemmy.ml · 10 months ago

    The writing was on the wall when they trained a generative AI on everyone’s code, of course without asking anyone for permission.

    • Elise@beehaw.org · 10 months ago

      It’s an interesting debate, isn’t it? Does AI transform something free into something that’s not? Or does it simply study the code?

      • Omega_Haxors@lemmy.ml · 10 months ago

        There’s no debate. LLMs are plagiarism with extra steps. They take data (usually illegally) wholesale and then launder it.

        A lot of people have been doing research into the ethics of these systems and that’s more or less what they found. The reason why they’re black boxes is precisely the reason we all suspected; they were made that way because if they weren’t we’d all see them for what they are.

        • AnonStoleMyPants@sopuli.xyz · 10 months ago

          The reason they’re black boxes is that that’s how LLMs work. Nothing new here; neural networks have basically been black boxes for a long time.

  • mozz@mbin.grits.dev · 10 months ago

    I’m still stuck on why I have to create a password-equivalent API token and then store it on my hard drive if I want an at-all-convenient workflow.

    “We made it more secure!”

    “How is storing it on my hard drive more secure?”

    “Just have it expire after a week!”

    “How is it more secure now? It seems like there are now two points of failure in the system, and anyway I keep hearing about security problems at GitHub that this hasn’t been a solution to any of.”

    “SHUT UP THAT’S HOW”

    • ISometimesAdmin@the.coolest.zone · 10 months ago

      An API token is more secure than a password by virtue of never needing to be typed in by a human. That cuts out phishing and written-down credentials, and tokens can also be restricted to narrow scopes, all of which makes them more secure.

      Expiration on its own doesn’t make a token more secure, but it can when the token is loaded onto a system you might lose track of or no longer have access to in the future.

      Individual API tokens can also be revoked without revoking all of them, unlike a password where changing it means you have to re-login everywhere.

      And that’s just the tip of the iceberg. Lmk if you have questions, though.
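The “token on my hard drive” objection above can at least be softened with file permissions. A minimal sketch, assuming a POSIX shell; the path and the token value are placeholders invented for illustration, not a real GitHub credential or a GitHub-documented location:

```shell
# Hypothetical sketch: store a placeholder token in a file only the owner
# can read, then load it into the environment at use time so it is never
# typed (or phished) interactively. Path and token value are made up.
umask 077                                    # new files: owner-only access
mkdir -p ~/.config/tokens
printf '%s\n' 'ghp_exampleplaceholder' > ~/.config/tokens/github
chmod 600 ~/.config/tokens/github            # belt and braces
GITHUB_TOKEN=$(cat ~/.config/tokens/github)  # read without echoing it
```

A platform credential store (OS keychain, a git credential helper) is stricter still, since the secret then never sits in a plaintext file at all.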