floofloof@lemmy.ca to Technology@lemmy.ml · English · 2 years ago

Generative AI boom "could come to a fairly swift end"

www.dezeen.com

166 points · 87 comments

Dezeen - architecture and design magazine (www.dezeen.com): The world's most influential architecture, interiors and design magazine.
  • hottari@lemmy.ml · +12 / −2 · edited · 1 year ago

    deleted by creator

    • Peanut@sopuli.xyz · +3 · 2 years ago

      Reminds me of the article saying OpenAI is doomed because it can only last about thirty years at its current level of expenditure.

      • hottari@lemmy.ml · +1 / −3 · edited · 1 year ago

        deleted by creator

        • diffuselight@lemmy.world · +2 · 2 years ago

          Cost reduction in this field has orders-of-magnitude potential. Look at LLaMA running on everything down to a Raspberry Pi within two months of release.

          There are massive gains still to be made; once we have dedicated hardware for transformers, that's orders of magnitude more.

          Ever notice your phone can play back 24 hours of video but dies after 3 hours of browsing? That's dedicated hardware codec support.

          • hottari@lemmy.ml · +1 / −4 · edited · 1 year ago

            deleted by creator

            • diffuselight@lemmy.world · +3 · edited · 2 years ago

              The trajectory is such that current Llama 2 70B models easily beat GPT-3.5 and are approaching GPT-4 performance, only a few months after release; an A6000 can run them comfortably.

              Nah, the trajectory is not in favor of proprietary models, especially since they will have to be dumbed down more and more for alignment.

              https://www.anyscale.com/blog/llama-2-is-about-as-factually-accurate-as-gpt-4-for-summaries-and-is-30x-cheaper?trk=feed_main-feed-card_feed-article-content
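The "30x cheaper" headline in that link is just a per-token price ratio; here is a minimal sketch of the arithmetic. The per-1K-token prices below are illustrative placeholders chosen to match the article's headline ratio, not figures from the thread:

```python
def cost_per_million(price_per_1k_tokens):
    """Dollars per 1M tokens, given a per-1K-token price."""
    return price_per_1k_tokens * 1000

gpt4_cost = cost_per_million(0.06)     # assumed $0.06 / 1K tokens (placeholder)
llama2_cost = cost_per_million(0.002)  # assumed $0.002 / 1K tokens (placeholder)

ratio = gpt4_cost / llama2_cost
print(f"{ratio:.0f}x cheaper")  # prints "30x cheaper" for these placeholder prices
```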

              • hottari@lemmy.ml · +1 / −2 · edited · 1 year ago

                deleted by creator

                • diffuselight@lemmy.world · +1 · 2 years ago

                  A 30B model, which is fine for specialized tasks, already runs on a 3090 or any modern Mac today.

                  On the current trajectory, we are months away from this being broadly affordable.
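The claim that a 30B model fits on a 3090 checks out with a rough back-of-envelope sketch. The 4-bit quantization and the ~20% overhead factor for activations and KV cache are assumptions for illustration, not figures from the thread:

```python
def vram_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weight storage plus ~20%
    for activations/KV cache (overhead factor is an assumption)."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 30B model at 4-bit quantization:
print(round(vram_gb(30, 4), 1))   # 18.0 GB -- fits a 24 GB RTX 3090
# The same model at fp16 would not fit:
print(round(vram_gb(30, 16), 1))  # 72.0 GB
```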

                  • hottari@lemmy.ml · +1 / −2 · edited · 1 year ago

                    deleted by creator

Technology@lemmy.ml

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


Ask in a DM before posting product reviews or ads; otherwise such posts are subject to removal.


Rules:

1: All Lemmy rules apply

2: No low-effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs (or their archived versions) as sources, NOT screenshots. This helps blind users.

5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)

6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: Crypto-related posts, unless essential, are disallowed

Visibility: Public. This community can be federated to other instances and be posted to and commented in by their users.

  • 151 users / day
  • 1.03K users / week
  • 2.36K users / month
  • 8.16K users / 6 months
  • 1 local subscriber
  • 38.8K subscribers
  • 3.26K Posts
  • 41K Comments
  • mods:
  • MinutePhrase@lemmy.ml