“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”

Article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0

  • TWeaK@lemmy.today · 10 hours ago

    Okay, I’m starting to think this article doesn’t really know what it’s talking about…

    For most of modern computing history, however, analog technology has been written off as an impractical alternative to digital processors. This is because analog systems rely on continuous physical signals to process information — for example, a voltage or electric current. These are much more difficult to control precisely than the two stable states (1 and 0) that digital computers have to work with.

    1 and 0 are in fact representations of voltages in digital computers. On a standard PC you typically have 3.3 V, 5 V and 12 V rails (and negative versions of some of them): a 0 represents roughly zero volts, while a 1 represents one of those specified voltages. If you look at the actual voltage waveforms, the signal isn’t truly digital but analogue, with a transient as the voltage swings between 0 and 1. It’s not a clean square step, but a slope that passes a pickup or dropoff point before settling at the nominal level. So at the physical layer a digital computer is basically the same as what they’re describing as an analogue computer.
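    The “slope, not a square step” point can be sketched with a simple RC-style rising edge (the time constant and threshold here are purely illustrative numbers, not real logic-family specs):

    ```python
    import math

    def rising_edge(t, v_high=5.0, tau=1e-9):
        """Idealised RC charging curve: the output climbs toward v_high
        rather than jumping there instantly."""
        return v_high * (1 - math.exp(-t / tau))

    # The receiver only registers a "1" once the slope passes its pickup level:
    threshold = 2.5
    for t_ns in (0.2, 1.0, 3.0):
        v = rising_edge(t_ns * 1e-9)
        print(f"t={t_ns} ns: {v:.2f} V -> reads as {1 if v > threshold else 0}")
    ```

    Early in the transition the analog voltage is in between the two nominal levels; the digital abstraction only holds because everyone agrees to sample after the slope has settled.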

    I’m sure there is something different and novel about this study, but the article doesn’t seem to have a clue what that is.

    • rowinxavier@lemmy.world · 3 hours ago

      To be clear though, the two defined states are separated by a voltage gap, so the signal is either on or off regardless of how on or how off it is. For example, if off is 0 V and on is 5 V, then 4 V is neither of those exactly, but it will still be read as on. If the voltage is above the critical threshold it counts as a 1; otherwise it’s a 0.
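      That thresholding can be sketched in a few lines of Python (the 2.5 V threshold is just an illustrative midpoint between the example rails):

      ```python
      def to_logic_level(voltage, threshold=2.5):
          """Read an analog voltage as a digital bit: anything above the
          threshold counts as 1, anything below counts as 0."""
          return 1 if voltage > threshold else 0

      print(to_logic_level(4.0))   # 4 V isn't exactly 5 V, but it still reads as 1
      print(to_logic_level(0.3))   # noise near 0 V still reads as 0
      ```

      This quantisation is what makes digital logic noise-tolerant: any drift within the gap is simply ignored.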

      An analogue computer would be able to use the whole continuous voltage range. Instead of needing a whole bunch of gates working together to represent a number, the voltage itself could be higher or lower: something that takes 64 bits could, in principle, be a single voltage. That would mean more processing in the same space and much less actual computation required.
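      To put a number on why that’s hard in practice: encoding a 64-bit value as one voltage means distinguishing 2^64 separate levels. A rough back-of-the-envelope (the 5 V full scale is an assumed example):

      ```python
      full_scale = 5.0      # assumed analog range, in volts
      levels = 2 ** 64      # distinct values a 64-bit word can take
      step = full_scale / levels
      print(step)           # roughly 2.7e-19 V between adjacent levels
      ```

      That step size is orders of magnitude below any real noise floor, which is why practical analog designs trade precision for density rather than packing whole words into one signal.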

    • Buffalox@lemmy.world · 8 hours ago

      Normal one-and-zero transistors can hold their state for a while, needing only refresh cycles at intervals.
      It seems logical to me that it’s harder to hold values with greater variance, which is probably also why everything works on binary systems rather than bits with, say, 3 or 4 states.
      If this weren’t a problem, the obvious move would be a decimal-based computer. There’s a reason we don’t have that, except via BCD, which uses 4 bits per digit and wastes 6 of the 16 possible values, which is very wasteful.
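      The waste is easy to quantify: a 64-bit word can hold 2^64 binary values, but the same 64 bits split into 16 BCD nibbles encode only 10^16 decimal values:

      ```python
      binary_capacity = 2 ** 64   # all states of a 64-bit word
      bcd_capacity = 10 ** 16     # 16 nibbles, one decimal digit each
      ratio = bcd_capacity / binary_capacity
      print(ratio)                # about 0.00054 -- well under 0.1% of the states
      ```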

    • themachinestops@lemmy.dbzer0.com · 9 hours ago

      This is an analog pc: https://en.wikipedia.org/wiki/Analog_computer

      https://en.wikipedia.org/wiki/Vacuum-tube_computer

      It does seem to be talking about this. From my understanding, analog doesn’t use 1 or 0 as a representation. It’s true that the CPU uses voltage, as you stated, but what differentiates it from analog is that in an analog machine the voltage isn’t quantised to 0 or 1; it’s used as-is in the calculations.

      They aren’t programmed; from my understanding they’re physically built to perform the calculation, like for example the https://en.wikipedia.org/wiki/Antikythera_mechanism