The fact that this has been replicated is amazing!

  • Adeptfuckup@lemmy.world · 1 year ago

    Hitch your tits and pucker up. We’re entering a new age of industry. Much like the original Industrial Revolution, technology is going to advance at an extremely rapid pace. Fusion, quantum computing supremacy. Just… wow. How far off is general AI with this new room temperature superconductor?

    • Dr. Dabbles@lemmy.world · 1 year ago

      Fusion is no closer than ever, and AGI is hilariously overhyped. Also no closer than ever.

      • Proweruser@feddit.de · 1 year ago

        And fusion is pretty close to begin with. Commonwealth Fusion is well within their proposed timetable so far. They don’t need any new superconductors for their project.

    • Yondoza@sh.itjust.works · 1 year ago

      Stupid question probably - is computing power what is holding back general AI? I’ve not heard that.

      • Dr. Dabbles@lemmy.world · 1 year ago

        What’s holding back AGI is a complete lack of progress toward anything like intelligence. What we have now isn’t intelligent; it’s multi-variable probability.

        • JGrffn@lemmy.world · 1 year ago

          It’s not that it’s not intelligent; it’s that predictive language models are obviously just one piece of the puzzle, and we’re going to need all the pieces to get to AGI. It looks incredibly doable, given that we’ve already figured out how to make something that’s dumb but sounds smarter than most of us. We just need to connect it to other models that handle other things better.

      • knotthatone@lemmy.world · 1 year ago

        Simply throwing computing power at the existing models won’t get us general AI. It will let us develop bigger and more complex models, but there’s no guarantee that’ll get us closer to the real thing.

        • MüThyme@lemmy.world · 1 year ago

          There is still heat generated by the act of computation itself, unless you use something like reversible computing, but I don’t believe there’s any current way to do that.

          And even then, superconducting semiconductors are still going to be some ways off. We could have superconductors in power transmission for the next decade and still have virtually no changes to processors. I don’t doubt that we will eventually do something close to what you describe, but I’d say it’s easily a long way off still. We’ll probably only be seeing cheaper versions of things that already use superconductors, like MRI machines.

            • MüThyme@lemmy.world · 1 year ago

              I appreciate you revising your reply to be less harsh. I wasn’t aiming to correct you on anything; I was just offering some thoughts. I find this stuff interesting and like to chat about it. I’m sorry if I made your day worse; I hope things improve.

              I said “superconducting semiconductors” just as a hand-wavy way to refer to logic gates/transistors in general. I’m aware that those terms are mutually exclusive, but that’s on me; I should have used quotes to indicate it was a loose analogy or something.

              The only thing I disagree with is your assessment that computation doesn’t create heat; it does, albeit in an entirely negligible amount. Traditional computation involves deleting information, which necessarily increases entropy, so heat is created. It’s called Landauer’s principle (there’s a rough numerical sketch below). It’s an extremely small proportion compared to resistive loss and the like, but it’s there nonetheless. You could pretty much deal with it by just absorbing the heat into a housing or something. We can, of course, design architectures that don’t delete information, but I’m reasonably confident we don’t have anything ready to go.

              All I really meant to say is that while we can theoretically create superconducting classical computers, a room-temperature superconductor would mostly still be used to replace current superconductors, removing the need for liquid helium or nitrogen cooling. Computing will take a long time to sort out; there’s a fair bit of ground to make up yet.
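
              A minimal back-of-the-envelope sketch of the Landauer bound mentioned above, assuming room temperature (T = 300 K) and an assumed, order-of-magnitude “typical” CMOS switching energy for comparison:

              ```python
              # Landauer bound on the heat from erasing information, compared with
              # an assumed ~1 fJ/bit CMOS switching energy (order of magnitude only).
              import math

              k_B = 1.380649e-23        # Boltzmann constant, J/K
              T = 300.0                 # assumed room temperature, K

              landauer_per_bit = k_B * T * math.log(2)      # minimum heat to erase one bit, J
              print(f"Landauer limit per bit: {landauer_per_bit:.2e} J")    # ~2.9e-21 J

              # Erasing one byte (8 bits) every nanosecond, i.e. at 1 GHz:
              bits_per_second = 8 * 1e9
              landauer_power = landauer_per_bit * bits_per_second
              print(f"Erasure heat at 1 byte/ns: {landauer_power:.2e} W")   # ~2.3e-11 W

              # Same rate with the assumed ~1 fJ per switching event:
              cmos_power = 1e-15 * bits_per_second
              print(f"Assumed CMOS switching heat: {cmos_power:.2e} W")     # ~8e-6 W
              ```

              The erasure term comes out several orders of magnitude below even an optimistic switching energy at the same rate, which is why it only matters in principle.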

                • MüThyme@lemmy.world · edited · 1 year ago

                  I think “rounding error” is probably the closest term I can think of. A quick back-of-the-envelope estimate says erasing 1 byte at 1 GHz will warm an average silicon wafer by about 1 K in ~10 years. That’s hilariously lower than I’m used to these things turning out to be, but I’m normally doing relativistic stuff, so it’s not really fair to assume they’ll be even remotely similar.
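
                  A rough sketch of that kind of estimate; the thermal mass is an explicit assumption (a few cubic millimetres of silicon, about 10 mg), and the answer scales inversely with whatever mass you pick, so treat it strictly as a ballpark:

                  ```python
                  # Back-of-the-envelope: how long Landauer heat alone takes to warm
                  # an assumed small piece of silicon by 1 K. Mass is an assumption.
                  import math

                  k_B = 1.380649e-23                 # Boltzmann constant, J/K
                  T = 300.0                          # assumed operating temperature, K
                  erase_rate = 8 * 1e9               # bits erased per second (1 byte at 1 GHz)
                  power = k_B * T * math.log(2) * erase_rate     # Landauer heat, ~2.3e-11 W

                  c_si = 705.0                       # specific heat of silicon, J/(kg*K)
                  mass = 1e-5                        # assumed thermal mass, kg (~10 mg)
                  heat_capacity = c_si * mass        # J/K

                  years = heat_capacity / power / 3.156e7        # seconds per year
                  print(f"~{years:.0f} years to warm the assumed mass by 1 K")   # ~10 years
                  ```

                  Whatever mass you assume, the point stands: next to resistive and switching losses, this is a rounding error.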

        • Yondoza@sh.itjust.works · 1 year ago

          Really appreciate the write-up! I didn’t know about the computing power required!

          Another stupid question (if you don’t mind) - adding superconductors to GPUs doesn’t really seem like it would make a huge difference to the heat generation. Sure, some of the heat generated is through trace resistance, but the overwhelming majority is the switching losses of the transistors, which will not be affected by superconductor technology. Are we assuming these superconductors will be able to replace semiconductors too? Where are these CPU/GPU efficiencies coming from?

            • Yondoza@sh.itjust.works · 1 year ago

              Semiconductors are used for transistors because they give us the ability to electrically control whether they conduct or resist electrical current. I don’t know what mechanism you’d use to do that with superconductors. I agree you don’t ‘have’ to have resistance in order to achieve this functionality, but at this time semiconductors or mechanical relays are the only ways we have to do that. My focus is not in semiconductor/IC design either, so I may be way off base; I don’t know of a mechanism that would allow superconductors to function as transistors (or “electrically controlled electrical connections”), but I really hope I’m wrong!