Hacker News
Memristor – The fictional circuit element (arxiv.org)
193 points by godelmachine on Aug 21, 2018 | 86 comments



I went down the rabbit hole.

Best terminal branches I found are:

* HP created something they claimed was a memristor [1]

* there were thermodynamic arguments about why it wouldn't work [2]

* HP discontinued the architecture based on the component [3]

1. https://arstechnica.com/information-technology/2014/06/hp-la...

2. https://arxiv.org/ftp/arxiv/papers/1207/1207.7319.pdf

3. https://www.theregister.co.uk/2016/11/29/hp_labs_delivered_m...


Well...

One thing about the OP paper smells very fishy to me:

About two years ago, an extremely important result about memristors and the notion of circuit elements in general came out. Unfortunately, it didn't get a lot of attention, and remains not-that-well-known even among experts. But: anyone making claims like the OP paper would HAVE TO consider it in their model if they want to be taken seriously. Here's the paper:

https://hal.archives-ouvertes.fr/hal-01322396/document Abdelouahab, Mohammed-Salah; Lozi, René; Chua, Leon O. - Memfractance: A Mathematical Paradigm for Circuit Elements with Memory [2016]

The fact that the OP paper doesn't even mention Memfractance makes me highly suspicious, and doubly so as the OP paper also brings up a "periodic table" of circuit elements - something the Memfractance paper also does (embedded in ℝ²), but uh... let's say with a lot more behind it. In fact, I'll quote the entire "Conclusions" section of the Memfractance paper here:

>In this paper, we have used fractional calculus in order to generalize and provide a mathematical frame for circuit elements with memory: memfractance. We have emphasized that the memfractance is a general paradigm for unifying and enlarging the family of memristive, memcapacitive and meminductive elements. The motivation and significance of this paper is that there may exist future nano-electronics devices that are more realistically modeled with memfractance elements.

>We have generalized the definition of fractance which was first introduced in 1983, and after that, introduced the paradigm of memfractance which is fitted for circuit elements with memory such as memristor, meminductor, memcapacitor and second-order memristor first introduced here. We have defined a new element called memfractor which possesses interpolated characteristics between those four circuit elements.

>We have then generalized Ohm’s law to memfractor and proved it. A particular, albeit wide-ranging, case of memfractance: the interpolated memfractance has been carefully studied through several numerical illustrative examples. Special attention has been devoted to the interpolated characteristic of a memfractor lying between memristor and memcapacitor which exhibits an unexpected new behavior of time variation of flux (φ–t curve). This phenomenon has been studied very carefully by the means of rigorous proofs.

>Finally, following Chua’s recent work in which an infinite discrete family of circuit elements: the (α,β) element is introduced in the scope of developing a rigorous mathematical theory of nonlinear circuits, we extend the previous generalized Ohm’s law in order to embed memfractors elements into this periodic table. For this aim, we define an infinite continued family of circuit elements including circuit elements with memory (such as second-order memcapacitor and meminductor and third-order memristor), with a special metric. We call this family: fractional circuit element family.

Note: That last passage is a reference to this specific paper:

https://www.semanticscholar.org/paper/The-Fourth-Element-Chu... - Chua, Leon O. - The Fourth Element [2012]

Which the OP paper also doesn't cite.

To me, this looks like a FUD campaign by Intel.

I also briefly glanced over the OP paper besides checking for citations in it, and found this passage:

"We have borrowed the expression “periodic table” as attributed to Chua" - which cites a secondary resource from 2008, before Chua's two above papers. Looking for any words starting with, or containing a fragment of, "fract" in the paper also yields no results.

Looks like the typical outcome of someone who thinks they don't suck at literature research, but does suck at literature research, which, quite honestly, the vast majority of scientists do. Ask a randomly sampled researcher if they've ever heard the term "main path analysis" and they'll stare at you like a deer staring into headlights.

Having said that:

The paper does seem to bring up a few valid points, e.g. people making errors & drawing bad conclusions from mislabeled diagrams. This isn't an issue in Chua's work, however.

Oh, and: Here, some slides on memfractance:

http://slideplayer.com/slide/3869533/

Edit:

Actually, this got generalized even further in early 2017:

https://www.sciencedirect.com/science/article/pii/S100757041... Machado, J.Tenreiro; Galhano, Alexandra M. - Generalized two-port elements


>>In this paper, we have used fractional calculus in order to generalize and provide a mathematical frame for circuit elements with memory: memfractance. We have emphasized that the memfractance is a general paradigm for unifying and enlarging the family of memristive, memcapacitive and meminductive elements. The motivation and significance of this paper is that there may exist future nano-electronics devices that are more realistically modeled with memfractance elements.

That sounds like a hoax to me. I had to look up fractional calculus, but it appears to be a real thing. It still sounds ridiculous and appears to bring memcapacitive and meminductive elements into their odd new world. At the end of the day, writing papers with strange terms isn't nearly as useful (to the rest of us) as actually producing a new device.


Here's a well-explained article about fractional derivatives and integrals (1). One freaky outcome is that the factorials of fractional numbers can be found. (Example: negative one half factorial = square root of pi.) It can be applied to all things quantum mechanical (2) so there's a fractional Schrödinger equation etc. I had to look it up, but the links are worth the bother.

(1) https://www.mathpages.com/home/kmath616/kmath616.htm (2) https://arxiv.org/abs/1009.5533
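The factorial claim checks out numerically: x! = Γ(x + 1), so (-1/2)! = Γ(1/2) = √π. A quick sketch using only Python's standard library:

```python
import math

# The gamma function extends the factorial: x! = gamma(x + 1),
# so (-1/2)! = gamma(1/2), which equals sqrt(pi).
neg_half_factorial = math.gamma(0.5)
assert abs(neg_half_factorial - math.sqrt(math.pi)) < 1e-12
```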



> That sounds like a hoax to me. I had to look up fractional calculus, but it appears to be a real thing. It still sounds ridiculous and appears to bring memcapacitive and meminductive elements into their odd new world.

Fractional calculus is a well established tool to develop physical models, particularly rheological and damping models. It makes sense to apply the same technique to model circuit components as rheological and electronic models do share quite a bit of common ground, and in some cases even symbolic representation.


> To me, this looks like a FUD campaign by Intel.

I've heard people suggest Intel might be acting like this due to (HP?) patent issues? (Edit: to be clear this was just interesting musing by an outside observer like the rest of us, not some source with any authority.)


...Really? Cripes. You got a source/link for that?


No, sorry for the confusion (I'll update the comment) -- I think it was just a musing/speculation. However I don't think this was the first person to think of this explanation -- it's been thought of before, e.g. https://news.ycombinator.com/item?id=9963265


Slightly better link:

https://news.ycombinator.com/item?id=9962755

Seems possible that this might indeed constitute part of some ploy by Intel, trying to avoid patent/licensing payments when actually using their 3D XPoint tech.


Pragmatic engineer's version: digikey shows 0 search results for "memristor".

(funny enough, doing the same at aliexpress yields a young kid's plastic ball ad, containing "Warning:Memristor")


> (funny enough, doing the same at aliexpress yields a young kid's plastic ball ad, containing "Warning:Memristor")

Happy Fun Ball may contain one or more memristors, and may or may not be sentient as a result. It is recommended to treat Happy Fun Ball with respect as a precaution.


For those among us 30 and younger who are too young to get the joke,

https://en.wikipedia.org/wiki/Happy_Fun_Ball


I went a step further a while back and emailed Digikey to see if they had any insight on memristor availability. This is what I got back:

Thank you for contacting Digi-Key with your request about Memristor parts. While none of Digi-Key's official suppliers have any existing memristor parts available, there are a few companies that have made experimental development chips available for sale.

https://knowm.org/product/bs-af-w-memristors

http://www.bioinspired.net/products-1.html

Seems like general availability of a real memristor is still a ways off - if at all.



Heh, interesting. Not that all the saber-rattling about patent violations particularly makes me want to do business with Knowm.


According to [1], RRAM is a form of memristor*. You can buy these at DigiKey [2]. However, I can't really think of a practical application. It's about 10x slower than (P)SRAM.

[1] - https://en.wikipedia.org/wiki/Resistive_random-access_memory

[2] - https://www.digikey.com/products/en/integrated-circuits-ics/...

> Leon Chua argued that all two-terminal non-volatile memory devices including ReRAM should be considered memristors. Stan Williams of HP Labs also argued that ReRAM was a memristor. However, others challenged this terminology and the applicability of memristor theory to any physically realizable device is open to question. Whether redox-based resistively switching elements (ReRAM) are covered by the current memristor theory is disputed.



At least you can try out a fictional one here:

http://www.falstad.com/circuit/e-mr.html


My favorite fictional-sounding-but-real storage device is the Stringy Floppy!

When I hung out at Radio Shack after school to play with the TRS-80's, some guy came by and showed his off, and I was in awe, and so jealous at how fast it was! I was sure that Stringy Floppies were the wave of the future. You didn't need to rewind them like tapes, because they wound back around!!!

https://en.wikipedia.org/wiki/Exatron_Stringy_Floppy

https://www.youtube.com/watch?v=EBfNy021K2Q

Then there was the Transputer!

https://en.wikipedia.org/wiki/Transputer


The Transputer was a success: a lot of the ideas developed for it are mainstream today, even if we don't call them transputers.

Kind of like all of the ideas from AI in the 70s and 80s that worked we don't call AI anymore.


> You didn't need to rewind them like tapes, because they wound back around!!!

You physically can't because they spool from the center like an 8-track tape. Probably why they didn't catch on: rewinding lets you treat it more like a really slow hard drive.


Fictional or no, this company sells them:

https://knowm.org/memristors/


You can buy them. Sometimes under branding that doesn't say memristor.

You can learn how they work and the science (see the video on https://knowm.org/memristors/)

Are they a fundamental circuit element or one of the existing fundamental elements? We can let scientists debate that.


Yeah, I'm really lost.

I feel like someone described useful properties of A Thing if it existed.

Then we figured out how to make A Thing that had those useful properties.

Is it A Thing, or Two Things?

That's pedantic. Pragmatically, don't we have the useful properties?


Well, in that case, was the Mechanical Turk a machine that played chess? I mean, it was a machine and it did play chess. There was a person inside it that actually did all the work, but does that make it not a machine that played chess? We could debate the meaning of "plays" until the cows come home, and perhaps arguing that the machine should be able to play chess on its own, or that it doesn't because one component of it is a human being that also functions independently of the machine, is pedantic. But perhaps it is productive after all to challenge and analyse claims regarding the nature of things people make.

If I understand this memristor paper correctly, the whole issue seems to be that what is supposed to be a new kind of Thing, that didn't exist before, is actually made up of already existing Things with well-understood properties, in one possible configuration that doesn't do anything radically new that other configurations can't already do anyway.

So in other words: we have the useful properties, but a) we always had them and b) there's no new Thing.


We did not always have what HP created.

It's not a trick, it just doesn't work on the underlying mechanism we thought it did. The new configuration is itself radically new.

The Mechanical Turk was a trick, designed to deceive.

It's a very unfair comparison.


Don't be too quick to call academics pedantic. You're probably missing some key facts of the argument that make the distinction relevant.


I agree with this. That said, the Wikipedia article has something interesting to point out regarding this [1]:

> Martin Reynolds, an electrical engineering analyst with research outfit Gartner, commented that while HP was being sloppy in calling their device a memristor, critics were being pedantic in saying that it was not a memristor.

[1] https://en.wikipedia.org/wiki/Memristor#Memristor_definition...


> I feel like someone described useful properties of A Thing if it existed. Then we figured out how to make A Thing that had those useful properties.

Apparently that's not quite the situation. I'd read the Wikipedia article. It says the following (keyword: "contrary"):

> Experimental evidence shows that redox-based resistance memory (ReRAM) includes a nanobattery effect that is contrary to Chua's memristor model. This indicates that the memristor theory needs to be extended or corrected to enable accurate ReRAM modeling.

[1] https://en.wikipedia.org/wiki/Memristor#Memristor_definition...


From an academic standpoint, this is all interesting.

From a pragmatic standpoint, they made a thing. And it sounds like the theory they based it on is not as sound as they thought. But they MADE the thing.

I think I would have appreciated a title more like, "Memristor - the devices are cool, but the theory was flawed."

Rather than implying that the devices themselves are fictional.


Theory predicted A, practice made B. Nobody is contesting whether B was made or implying B is fictional. Everyone can see B exists. The debate is whether the claim that B = A is correct, i.e. whether A is fictional or whether A has been found or whether we should keep looking. If all you care about is the present then that's your prerogative, but the people making this stuff you use need to know when they need to stop vs. keep digging for you.


"Memristor – The fictional circuit element" implies B does not exist. It's a lousy title.

Perhaps "Memristor - An interesting device based on a flawed theory" would have been better.

The debate I'm having is whether the language used was inappropriate, i.e. whether the best words were used or if we should keep looking for better words. If all they care about is clicks then that's their prerogative, but the people reading these words need to know what they're supposed to mean.

Because "fictional" has a specific meaning, and this isn't it.


> "Memristor – The fictional circuit element" implies B does not exist. It's a lousy title.

No, it implies A does not exist, not B. Memristor = A. The term existed before HP's B came along.

Whether or not you believe A = B, somehow I think you're smart enough to realize that nobody is claiming B—which everyone can see with their own eyes—does not exist, and yet that's what you're still arguing.


The top comment:

> * HP created something they claimed was a memristor [1]

> * there were thermodynamic arguments about why it wouldn't work [2]

> * HP discontinued the architecture based on the component [3]

Even as a relatively informed person, the headline and the top comment made me believe "a thing with these properties won't work in the real world."

What does "fictional" mean to you?


Cool, so the top comment by a fellow HNer like yourself confused you, so you blamed your confusion on the author of the arXiv article.

That's your fault for ignoring the links [1] [2] [3] in that comment and reading 3 bullet points as 1 coherent sentence. Those are very obviously 3 different sentences from 3 entirely separate articles. If it's ambiguous for you what part of the passage from [1] the word "it" from passage [2] refers to (and feel free to blame the author of the comment if it makes you feel better), you're supposed to click on the links... that's why they were provided to you. They're there precisely to clear up any confusion or inconsistencies.

And if you click the links and search for "thermodynamic", you see it's explained pretty unambiguously, and the ambiguity was introduced in the top comment:

> There is some controversy over whether what Williams developed is actually a memristor because the concept of a memristor itself is seen by some as a violation of the laws of non-equilibrium thermodynamics.

Clearly you can see the "it" that some people claim to be violating thermodynamics (and hence fictional, unreal, or whatever you want to call it) is the concept of a memristor that has existed since decades ago, not the physical thing HP created long after -- that would be preposterous.


So, it's not "misleading clickbait," as long as you're almost an expert in the topic, and you research all of the links?

OR. Maybe the word "fictional" shouldn't have been used?


> A nonlinear ϕ-q curve will always have a positive, albeit variable slope. However, the ratio of ϕ to q still has the units of ohm, without a phase shift, making this a nonlinear resistor.

I think this is false. A nonlinear ϕ-q curve will yield a resistance which depends on time in a manner that can't be described by a single nonlinear v-i curve.
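To make the point concrete, here is a minimal numerical sketch (Python; the flux-charge curve ϕ(q) = q + q³ is made up purely for illustration). Because the memristance M(q) = dϕ/dq depends on the accumulated charge, the instantaneous v/i ratio drifts over time under a plain sinusoidal drive, which no single nonlinear v-i curve can reproduce:

```python
import math

def memristance(q):
    # Hypothetical flux-charge curve phi(q) = q + q**3 (made up for
    # illustration), so M(q) = d(phi)/dq = 1 + 3*q**2, in ohms.
    return 1.0 + 3.0 * q ** 2

dt, q = 1e-4, 0.0
ratios = []
for n in range(20000):
    t = n * dt
    i = math.sin(2 * math.pi * t)   # sinusoidal current drive, amps
    q += i * dt                     # charge is the running integral of current
    v = memristance(q) * i          # v = M(q) * i for an ideal memristor
    if abs(i) > 1e-6:
        ratios.append(v / i)        # instantaneous v/i "resistance"

# The ratio is not a fixed constant: it drifts as charge accumulates,
# so no single v-i curve describes the element.
print(min(ratios), max(ratios))
```

With this toy curve the ratio wanders between roughly 1 Ω and 1.3 Ω over one drive period, despite the drive itself never changing.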


Heard on the grapevine that IBM claims to have pulled first silicon for a memristor RPU (design like the one below) just over a month ago.

https://www.nature.com/articles/s41586-018-0180-5


I seem to remember HP having made these a number of years ago.


This article from Ars Technica talks a little bit about memristors and HP's plans for "The Machine". The original plan was for The Machine to demonstrate this technology, but HP ended up removing feature after feature (including memristor technology) before its release. They quote John Sontag from HP in the article:

"The simplest way to think about it is this—take a DRAM DIMM out, and put a memristor DIMM in,” said Sontag. “You now have another pool of memory that’s denser and nonvolatile. It’s a new class of memory—the consequence for operating systems is that moving stuff around from I/O devices [to and from disk] becomes unnecessary."

https://arstechnica.com/information-technology/2014/06/hp-la...


Doesn't magnetic core memory (which has been around forever) do similar things?


The density is terrible and the read process is destructive.


Both seem to apply equally well for memristors. In both theory and practice.


HP's lab (Stanley Williams) made their memristor announcement 10 years ago.

"The missing memristor found" https://www.nature.com/articles/nature06932

and still working on it "Capacitive neural network with neuro-transistors" https://www.nature.com/articles/s41467-018-05677-5


https://www.hpcwire.com/2015/06/11/hp-removes-memristors-fro...

But what they watered it down to (and they haven't delivered on that either) seems to be roughly equivalent to the fancy flash memory you can buy now for the new Intel chipsets.


It's unclear to me if the problem was a lack of vision on the part of HP or if the technology they came up with has fundamental problems that prevent it from being practical. Either one seems plausible.


Don't know either. But if you google for something like "memristor oxygen transfer problem", lots of stuff comes up.


I dream of a bright future where my PC will have no RAM, HDD, or SSD, but only memristor memory. No need for booting or shutdown; I just add power and everything will be the same as it was when last used.

Since I first read about them years ago, this is still what I am waiting for, even if it does not seem to be within grasp.


What you're asking for is a fast, cheap, and persistent memory.

It doesn't matter in the slightest whether it's a memristor or some other tech.

And we can approximate this right now with a capacitor to dump out RAM upon power loss. The reason you can't get the experience you want is software, not hardware.


This is theoretically around the corner; Intel and Micron created a memory technology called 3D XPoint that is persistent and is supposed to have latency close to that of DRAM.

Intel just started shipping DIMM modules, though apparently CPU support still hasn't arrived yet: https://www.anandtech.com/show/12828/intel-launches-optane-d... https://www.theregister.co.uk/2018/08/10/optane_dimm_ceremon...


You've got to be able to reboot for when program bugs leave your memory in an invalid state.


Rebooting does not help when your files are in an invalid state. (This may be relevant to software environments which persist the state of the entire "world.")


Any reason this can't be done with flash? I wonder what the speed of a flash-RAM computer would be. I'm sure it'd be slow but it'd be interesting. You'd still have to dump and restore the CPU registers of course.


I think the main issues are that it would be crazy slow, and flash isn't really random access for writes -- you still have to blank whole pages at a time, so you'd definitely still want a RAM caching layer in front of the flash, even if it was being synced on a second-by-second basis.

Plus the whole wear levelling thing would have a huge impact on lifespan.

Tiny microcontrollers would be the first ones to benefit from an all-flash architecture, but they all still ship with both flash and RAM onboard.


NAND flash isn't random access even for reading, but generally these limitations are not inherent in the technology, but are motivated by reducing the die area per bit of storage (and thus cost). Purely random access "Flash" is typically called EEPROM and readily available with SRAM-compatible interface (and costs several times more than SRAM).

In the microcontroller space there is for example TI's MSP430FR series which has unified non-volatile storage based on what could be described as core-memory on chip.


Such things were possible with core memory, bubble memory, and FRAM. None are really fast enough by today's standards, though you can get some microcontrollers and battery-backed SRAM replacements made with FRAM.


May end up happening with persistent memory--which may or may not replace current DRAM depending upon who you ask.


We had that: it's called core memory.


tldr + spoilers:

there doesn't seem to be a mechanism for memristors to exist without significant nonlinearity or being a fully passive device, so it's kind of hard to say it's a fundamental circuit element


more like memeristor


I disagree with their claim that current is the time derivative of charge.


Are you being so pedantic as to ask for "through an area"? Because that's literally the defined relationship between current and charge.


The amount of charge in an area stays constant because of Kirchhoff's laws, so dQ/dt=0 always. Wouldn't a better definition be:

  I = q v . A
where q is charge density. A is an area (actually, a normal vector to some flat cross-section, with magnitude equal to size of area), and v is velocity of the charged particles. I'm using the dot product.

If we check the units: (coulombs * metres^{-3}) * (metres * seconds^{-1}) * metres^{2} = coulombs / seconds

By the way, I fudged the above to make "charge density" have units $coulombs * metres^{-3}$. I'm not the best person at physics.
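For what it's worth, the drift-velocity form above does give sensible numbers. A rough sketch (Python; the copper carrier density and drift speed are assumed textbook-style values, not from this thread):

```python
# Illustrative check of I = rho * v * A, with v parallel to the
# area normal so the dot product reduces to a scalar product.
# All numbers are assumed approximate values, not from the thread.
e = 1.602e-19      # elementary charge, coulombs
n = 8.5e28         # conduction electrons per m^3 in copper (approximate)
rho = n * e        # charge density, C/m^3
v = 7.4e-5         # drift speed, m/s (assumed)
A = 1e-6           # cross-sectional area, m^2 (1 mm^2)

I = rho * v * A
print(I, "amperes")   # roughly one ampere
```

The units work out to coulombs per second as in the comment above, and the magnitude lands where household wiring currents do.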


> The amount of charge in an area stays constant because of Kirchhoff's laws, so dQ/dt=0 always.

Eh what? No. There seems to be some huge misunderstanding here. First of all, charge is not "in an area". It's in volumes, and flows through areas. And the amount of charge in a volume is absolutely not constant (in general). If you think it is, then please explain what happens when you rub a balloon on your hair, or why/when lightning strikes.


I was referring to Kirchhoff's first law which only applies to circuits -- that example is enough to show that I=dQ/dt is wrong. I wasn't considering the cases you're talking about (lightning etc.) because only one example is needed to prove an equation wrong. I could easily produce a single specific example where dQ/dt=0 but there is clearly a non-zero current present.

And you can talk about charge in an area (sort of) if you multiply charge density by area; the result has units coulombs * metre^{-1}. I thought counting charge in a cross-section of wire was something people did, one way or another, even if the units aren't Coulombs but Coulombs*metre^{-1}.

I could very easily have misunderstood something. With the clarifications above, I'm not clear on what though.


Okay, that's much more nuanced than "dQ/dt = 0 always".


This isn't mere pedantry; there is a difference between current and the time derivative of charge. I don't mind that they prefer to use one or the other, but if you insist on making a periodic table then you've got to take both into account.

It's also an especially bad idea to claim the time derivative of current is the double derivative of charge, given that these are two very different notions of derivative.

To clarify, claiming current is the derivative of charge is like claiming the current of a river is the derivative of its water level.


Current is unequivocally the time derivative of charge. There are no two ways about it.

i(t) = dQ(t) / dt

"To clarify, claiming current is the derivative of charge is like claiming the current of a river is the derivative of its water level." This statement is wrong; while water in some cases provides a decent analogy for electricity, this is not one of them.


This is what happens when someone with a clear idea of a lumped element model slams into someone with a clear idea of Maxwell's equations.


That equation is unequivocally true...under certain assumptions.

Contravariant is simply saying that it's being stated and used as if it holds generally; but that's not the case.

This thread quickly dismissed 'through an area' as pedantic, but as far as I can tell that really is the crux of the debate: this simplified form of the equation only holds for scalar flux - viz. constant over and perpendicularly through a fixed flat surface.

In the generalised form, `I` is a function of the area at a point.


If I have a piece of wire with a constant current through it, the charge everywhere on the wire remains constant. In fact the charge will be more or less neutral in most cases.

Note that the alternative would be that the wire somehow accumulates charge as it passes current, which clearly isn't happening.


No, you are wrong. Electrical current is the mathematical representation of a charged particle moving between two points. The value of the current is directly related to the amount of charge being moved.

As for your "wire" example, you're conveniently ignoring that a single (open-ended) wire isn't a closed circuit and as such there can be no current (and hence no accumulating charge). And if it's a closed circuit, then there are other elements in the circuit that consume or generate charge.


They are not wrong. In the Maxwell equations both come in as fundamentally different terms.

The claim is that current density J is different from the time rate of change of charge density dρ/dt.

That is not to say they are unrelated; they are related by the continuity equation,

    dρ/dt = -∇·J.
The distinction is real, because what you are calling current in the one case is actually a spatial derivative of current, as indicated by the ∇.

I would actually go a step further than this and say that current is actually properly defined as the source of magnetic field. On the conventional definition of current, it is physically impossible for current to flow through a capacitor, but we speak of that all the time. So the True Current Density is just

    J + ε dE/dt
in SI units. Actually taking that seriously, however, does require committing to language which sometimes seems a little awkward, like saying electromagnetic radiation involves an AC current oscillation that propagates through empty space transverse to its oscillation.
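The continuity equation above is easy to sanity-check numerically in one dimension, where ∇·J reduces to ∂J/∂x. A small Python sketch (the drifting Gaussian pulse ρ(x,t) = exp(-(x-t)²) with unit drift speed is an arbitrary choice):

```python
import math

def rho(x, t):
    # a Gaussian charge pulse drifting at unit speed (arbitrary profile)
    return math.exp(-(x - t) ** 2)

def J(x, t, v=1.0):
    # convective current density J = rho * v
    return rho(x, t) * v

h = 1e-5
x, t = 0.3, 0.1
drho_dt = (rho(x, t + h) - rho(x, t - h)) / (2 * h)
dJ_dx = (J(x + h, t) - J(x - h, t)) / (2 * h)

# 1-D continuity: d(rho)/dt = -dJ/dx, so their sum should vanish
assert abs(drho_dt + dJ_dx) < 1e-6
```

The two terms really are different quantities (a time derivative of density vs. a spatial derivative of current density); continuity only says they balance.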


> I would actually go a step further than this and say that current is actually properly defined as the source of magnetic field.

That's actually (still, and somewhat) how the Ampere is defined. There are ongoing efforts to change this though.

I... prefer to avoid discussions like this one, but I thought you might appreciate this part :-).


Fine, charge a circular piece of wire, place it in space and spin it around. Voila, current without any accumulation of charge.

Honestly I'm confused why it's turning out to be such a controversial point that there's a difference between a change in charge and the movement of charges.

Heck in their diagram they claim that the voltage across an inductor is the double derivative of its charge, but inductors can't even hold charge so it's unclear what they're trying to say. They also claim that the charge across a resistor is somehow related to the integral of the voltage across it, but again (ideal) resistors can't really hold charge. The only way to interpret this supposedly 'universal' periodic system is by interpreting charge and its derivatives in different ways depending on context, which isn't convincing in a supposedly universal system.


It's OK; if I didn't have a Master's in the field, I would probably have similarly downvoted you.

The problem is mainly that the criticism you are making is not great for pedagogy. What is being called “charge” is probably something like “disposition to accumulate charge” or so, in the same way that force is not actually mass times acceleration, but it's mass times a disposition to accelerate, so that you can do things like measure my weight-force even though I’m not falling through the floor.

The dispositional truth of the matter is fundamentally more cognitively complex to teach than the simple rule that you get when you say that everything does what it's disposed to do, and so everybody has memorized the version of the definitions that has no dispositions, and gets very confused when you point out that aspect of those definitions.


I suppose I also didn't provide an awful lot of explanation to my point, but I figured I could just explain when asked. I didn't expect the difference between current and a change in charge to be this controversial.


Except there has been no controversy at all. Everybody understands that current (i. e. flow of electric charge, water, etc.) may have nothing whatsoever to do with "change in charge" (or in the mass of water) contained in a volume of space through which charge or water flows.


If what's being called "charge" here is charge distribution between one side of the element and the other side within the circuit (basically, electrons to the left minus electrons to the right, divided by two lest we count a single moving electron twice), and what's being called "current" here is simply current across the element, then in this case, i = dq/dt.

That's a good mathematical model of the behavior of the elements from a pedagogical perspective, though confusing for more advanced readers.


I believe there is another case, very similar (also involving a closed loop): If you pass a (properly oriented) magnet through or even across the opening of such a loop, an electrical current will flow around the loop (while the rest of the atomic bits will stay more or less in the same place ;) ) and no charge will accumulate.

(This caught my attention because I had up to that point believed that electrical fields are always conservative, and this demonstrates that not to be true: an electron traveling around the loop in such a manner will eventually return to its starting point having done non-zero work)


They are both correct: I = dq/dt is true only for scalar flux - a fixed, flat, surface with constant charge applied perpendicularly at every point across it. A popular simplification/special case of the general equation from vector calculus.


"charge everywhere on the wire remains constant" - perhaps. But the charge is still flowing, it's still moving at a rate of dQ/t.

What is dQ/t you may ask? It's the time derivative of charge. It's the change in charge over time, the rate of flow of charge.

I=Q/t

Current=Charge/Time

Amps=Coulombs/Second

I don't know how to make this clearer.


> To clarify, claiming current is the derivative of charge is like claiming the current of a river is the derivative of it's water level.

My electromagnetics skills are a little rusty, but I don't think that's the right analogy. You wouldn't say that the speed of a runner is the derivative of their height; similarly, when someone is running, you wouldn't try to say that their speed isn't a derivative of their height, as their height is more or less constant "for the length of the run".

Similarly, it isn’t the height of the runner, but the distance they’ve covered, it isn’t the depth of the river, but how much water has passed through, it isn’t the static charge of the wire, but how much charge has passed through.


It's only a little bit off. To get the true hydrodynamic analog to a capacitor, connect the bars of two big pistons together so that volume accumulated in one comes at the expense of the other, then hit the bar with a spring so that its motion comes with some energy cost that can oppose a constant pressure.

The point is that this component alone ties flow to accumulation, whereas generally your other components (resistors=thin pipes, wires=thick pipes, batteries=Archimedes screws, inductors=turbines connected to flywheels) do not accumulate volumes of water inside of them. Flow needs to make sense even without accumulation due to flow-balance, just like force needs to make sense even in situations where velocity stays constant due to force-balance.


I think you're just rephrasing what I'm trying to say in several different ways.

We can probably agree that the amount of charge passing through is different from the change to the total charge.

And in my opinion something like a 'periodic system' can't just treat both the same. It's especially bad to derive identities by taking the time derivative (or anti-derivative) on one side and switching between current and charge on the other.


Good thing you pointed this out; it lends more credence to my theory that this entire paper constitutes classic Intel FUD. See my comment here:

https://news.ycombinator.com/item?id=17811314

I suspect that the two papers I linked there will likely make you a lot happier than the OP paper. :)



