Edit: my first draft was harsher than it needed to be; sorry, long day.
First of all, nobody’s saying this is going to happen overnight. Secondly, traditional computing systems generate heat due to electrical resistance and inefficiencies in semiconducting transistors; the process of computation does not inherently require generating heat, nor does it produce heat through some means other than electrical resistance. It’s not magic.
Superconduction and semiconduction are mutually exclusive - it’s in the name. A semiconductor has resistance properties midway between a conductor and an insulator. A superconductor exhibits no electrical resistance at all. A material can be a superconductor in one “direction” and a semiconductor in another, or a semiconductor can be “warped” into being a superconductor, but you can’t have electrons flowing in the same direction with some resistance and no resistance at the same time. There’s either resistance, or there’s not.
Finally, there is absolutely no reason that a transistor has to be made of a semiconducting material. They can be made of superconducting materials, and if they are then there’s no reason they’d generate heat beyond manufacturing defects.
Yes, I’m talking about a perfectly superconducting system and I’m not allowing for inefficiencies where components interface or component imperfections resulting in some small amount of resistance that generates heat; that would be a manufacturing defect and isn’t relevant. And of course this is all theoretical right now anyway; we don’t even know for sure if this is actually a breakthrough yet (even if it’s really beginning to look like it). We need to better understand the material and what applications it’s suited to before we can make concrete predictions on what impacts it will have. But everything I suggest is grounded in the way computer hardware actually works.
I appreciate you revising your reply to be less harsh. I wasn’t aiming to correct you on anything, just offering some thoughts; I find this stuff interesting and like to chat about it. I’m sorry if I made your day worse, and I hope things improve.
I said “superconducting semiconductors” as just a hand-wavy way to refer to logic gates/transistors in general. I’m aware those terms are mutually exclusive, but that’s on me; I should have used quotes to indicate it was a loose analogy or something.
The only thing I disagree with is your assessment that computation doesn’t create heat; it does, albeit in an entirely negligible amount. Traditional computation involves deleting information, which necessarily increases entropy, and that entropy increase has to be paid for as heat. It’s called Landauer’s principle. It’s an extremely small proportion compared to resistive losses and the like, but it’s there nonetheless; you could pretty much deal with it by just absorbing the heat into a housing or something. We can, of course, design reversible architectures that don’t delete information, but I’m reasonably confident we don’t have anything ready to go.
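For a sense of scale, the Landauer bound is easy to compute directly. A quick sketch (assuming room temperature, T = 300 K; the constants are standard CODATA values):

```python
import math

# Landauer's principle: erasing one bit at temperature T must dissipate
# at least k_B * T * ln(2) as heat. Values below assume T = 300 K.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

E_bit_J = k_B * T * math.log(2)       # joules per erased bit
E_bit_eV = E_bit_J / 1.602176634e-19  # same energy in electron-volts

print(f"{E_bit_J:.3g} J per erased bit (~{E_bit_eV:.3f} eV)")
```

That works out to roughly 2.9×10⁻²¹ J (about 0.018 eV) per erased bit, many orders of magnitude below the switching energy of any real transistor.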
All I really meant to say is that while we can theoretically create superconducting classical computers, a room-temperature superconductor would mostly still be used to replace current superconductors, removing the need for liquid-helium or liquid-nitrogen cooling. Computing will take a long time to sort out; there’s a fair bit of ground to make up yet.
Okay, you’re kind of reaching with that one 😋 I didn’t mention Landauer’s principle because it’s so negligible as to be irrelevant (seriously, the heat generated by erasing a bit is only kT ln 2, about 0.018 eV at room temperature, nearly a thousand times less than the energy binding the electron in a hydrogen atom), and superconductors will reduce even that. I kind of wish we had another word, for when “negligible” doesn’t do the insignificance justice.
I do appreciate the clarification on the point of superconducting semiconductors - and the concern for my day haha! It really wasn’t anything to do with you, hence the edit. And, your point here is absolutely correct - LK-99 isn’t some magical material that can be all things to all people. Its other properties may make it unsuitable for use with existing hardware manufacturing techniques or in existing designs, and we may not find superconductors that can fill every role that semiconductors currently occupy.
Edit: lol, looks like its “other properties” include not being a fucking superconductor. Savage.
I think “rounding error” is probably the closest term I can think of. A quick back-of-the-envelope estimate says erasing 1 byte per cycle at 1 GHz dissipates only ~2×10⁻¹¹ W, which would take on the order of a hundred thousand years to warm an average silicon wafer by 1 K, even with no cooling at all. That’s hilariously lower than I’m used to these things turning out to be, but I’m normally doing relativistic stuff so it’s not really fair to assume they’ll be even remotely similar.
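Redoing that envelope estimate in code (the wafer figures are assumptions: a 300 mm wafer of ~128 g, silicon’s specific heat of ~705 J/(kg·K), and no cooling whatsoever):

```python
import math

# Back-of-the-envelope Landauer estimate: erase 1 byte per clock cycle
# at 1 GHz and dump all of that heat into one silicon wafer.
k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # room temperature, K
E_bit = k_B * T * math.log(2)  # Landauer limit per erased bit, J

power = 8 * 1e9 * E_bit        # 8 bits/cycle * 1e9 cycles/s, watts

# Assumed wafer: ~128 g of silicon at ~705 J/(kg*K), i.e. ~90 J/K total.
wafer_heat_capacity = 0.128 * 705  # J/K
seconds_per_year = 3.156e7
years_to_warm_1K = wafer_heat_capacity / power / seconds_per_year

print(f"power ~ {power:.2g} W, ~{years_to_warm_1K:.0f} years per kelvin")
```

With those round numbers the dissipation is about 2.3×10⁻¹¹ W and warming the wafer by 1 K takes on the order of 10⁵ years, so “rounding error” undersells it if anything.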