• Buffalox@lemmy.world · 6 months ago

    Apparently due to silicon degradation.

    It’s funny, because I have never experienced that, and I’ve always overclocked and overvolted my CPUs. My current CPU is a 7-year-old Ryzen R5 1600, which I’ve been running with both an OC and higher voltage. Every CPU I’ve had over 40 years has been replaced because it became obsolete. I worked as an IT consultant for 10+ years in the ’90s and ’00s and NEVER experienced silicon degradation. All sorts of other problems, sure: faulty soldering when lead became illegal, and capacitors when fake, poor-quality Chinese capacitors found their way into production. There is no way silicon degradation should be an issue within a short time span of a couple of years.

    Anyway, as it looks now, it doesn’t seem like a good idea to buy Intel.

    • Mesophar@lemm.ee · 6 months ago

      These are specifically about 13th and 14th gen Intel processors, so I don’t know if a Ryzen from 7 years ago is a relevant comparison. However, no, it isn’t a good idea to buy Intel at this time.

      • Buffalox@lemmy.world · 6 months ago

        I know it’s not the same process, but I’ve been hearing about silicon degradation for at least two decades now, and I’ve never seen evidence that it’s actually a thing.
        By the way, I also have an Amiga 500 from the late ’80s that is still working! If silicon degradation were actually a thing, how is that even possible? Obviously chips can’t last forever, but they have always been able to last way beyond the point where they become obsolete.
        Airports sometimes still use equipment from the ’70s and ’80s too. So I doubt degradation is really an issue: even if modern processes can’t last 60 years, I maintain that degradation within just a couple of years should be impossible.

        • ferret@sh.itjust.works · 6 months ago

          Modern CPUs have transistors that are orders of magnitude smaller than the ones in your Amiga, and there is a direct correlation between transistor size and how much abuse they can take. Degradation also only happens while the device is powered on, and for modern CPUs it is far worse when the chip is boosting at high voltage than when it is idle. You can expect silicon degradation to become something you actually need to worry about as CPU feature sizes continue to shrink. It will probably never get to the point where chips degrade faster than they become obsolete, though (a dramatic reduction in CPU improvement cadence might do it). The sketch below gives a rough sense of the scaling.
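
          As a rough illustration of why current density, voltage, and temperature matter, here’s a small sketch based on Black’s equation, a standard empirical lifetime model for electromigration. All the constants and operating points below are made-up illustrative numbers, not measured values for any real CPU:

          ```python
          import math

          # Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T))
          # An empirical electromigration lifetime model; constants here are illustrative.

          K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

          def relative_mttf(current_density, temp_kelvin, n=2.0, ea_ev=0.7):
              """Relative mean time to failure (arbitrary units).

              n: current-density exponent, typically 1-2 (assumed here)
              ea_ev: activation energy in eV, material dependent (assumed here)
              """
              return current_density ** -n * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_kelvin))

          # A baseline part vs. one with denser current (smaller wires) running hotter,
          # roughly what a shrunken node pushed to turbo voltages looks like.
          baseline = relative_mttf(current_density=1.0, temp_kelvin=350)  # ~77 degrees C
          stressed = relative_mttf(current_density=2.0, temp_kelvin=380)  # ~107 degrees C

          print(f"stressed part lasts {stressed / baseline:.1%} as long as baseline")
          ```

          With these made-up numbers the stressed part lasts only about 4% as long, which is the general shape of the argument: the exponential temperature term and the current-density exponent punish small, hot, high-voltage transistors hard.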

    • Robin@lemmy.world · 6 months ago

      I’ve never noticed silicon degradation on a CPU either. But I have had a GPU with a stable overclock for years that started getting a bit flaky, and I had to go back to stock settings. Of course, GPUs also get more frequent driver updates, so maybe that effect was the driver and games trying to squeeze more out of the hardware.