It's a shame RMS made no reference to the current activity around RISC-V (http://riscv.org/), a freely implementable instruction set architecture with a BSD-licensed reference implementation.
At lowRISC (http://lowrisc.org/) we aim to produce a fully open source SoC and produce it in volume. The aim is open source down to the HDL, that is, the Verilog (or in this case Chisel) that describes the hardware. There is an extra step of placing and routing the design for a specific process, but ultimately this relies on a process design kit for the process in question, which comes with stringent NDAs. We are actually fortunate enough to be taking part as a mentoring organisation in Google Summer of Code, in collaboration with a number of our friends in the wider open source software and hardware communities, so if you're a student and open hardware interests you, there's an opportunity to get paid to contribute over the summer: http://www.lowrisc.org/docs/gsoc-2015-ideas/
It was in Wired magazine, so I'm sure he needed to write with a certain level of concision. You might want to email him at rms at gnu dot org ... I've got to imagine he would support your efforts, and the FSF has a hardware certification program, so it couldn't hurt to try to open a line of communication.
It would probably help if RMS understood a little about how inventors of hardware circuitry make a living instead of posting from a position of almost complete ignorance.
Circuit topologies certainly can be patented. "Copyright" is an irrelevant concept here. The IP doesn't reside in a copyable drawing or shape, but in the commercial value of the topology.
What matters isn't the graphical or notional arrangement of the components, but the fact that the arrangement provides a novel and unique solution to a specific problem and/or implements a significant new capability.
Licensees can then pay for the right to use the topology in their own products.
If you want to make your topology free, you can simply release it into the public domain. Prior art rules will then make it very hard to patent.
The various CC/GPL licenses make little or no sense in this context, because most hardware topologies aren't directly copyable without modification and tuning.
Besides, a lot of designs already use standard topologies. There's no patent on many standard digital and analog design elements because they're already in the public domain.
Some of them have been in the public domain for decades.
Either you didn't read the article, or you are deliberately knocking down straw men for a seemingly bizarre reason.
RMS laid out EXACTLY the points you've described: that hardware and circuits can be patented, but that circuit topology cannot be copyrighted.
I think the idea that Dr. Stallman is writing about is a framework for sharing designs and collaborating that has the potential to grow into a thriving free hardware ecosystem.
It seems you may feel threatened by such a development, but for the reasons you outlined, it may not be a threat. I suppose however that having public repositories of innovative hardware designs could make it harder to obtain patents if it turns out that prior art was posted and timestamped on the Internet.
"It is difficult to get a man to understand something, when his salary depends on his not understanding it." - Upton Sinclair
Did you read the original article? There is specifically discussion about how copyright cannot protect the actual topology.
The point is not to remove profitability from hardware. The point is for hardware you purchase to include the full schematic and source used to create that hardware, so that you can easily modify it and create a derivative work if you wish to.
Many, many digital designs do not include high value patented topologies, but are simply complex interconnections of many separate ICs. You can always choose a license without a patent grant too, if you want.
I think that's the opposite of what the GP is saying. The thesis seems to be that hardware is patentable because what matters is not the schematic/topology, but the physically manufactured circuit board, whereas in software you can directly copy an algorithm. It's the "you wouldn't download a car" differentiation.
Hardware designs require prototyping for any significant amount of complexity. Prototyping requires a back and forth costing many thousands of dollars. Obviously RMS didn't read this: http://www.mauve.plus.com/opensourcehw.txt
>Prototyping requires a back and forth costing many thousands of dollars.
This is a problem that is being rapidly solved in the OSHW community, with low cost PCB prototyping services such as OSH Park and similar. Something of the complexity of a laptop motherboard may still be in the thousands of dollars range, but a large variety of hardware is now incredibly affordable.
Even something like a low-cost phone would fall under this. Unless you're making a simple control board for a physical process, OSH Park won't cut it. And of course RMS is talking about computing. Anything at the general computer level is probably not going to work out unless it's really tiny.
Hardware development is, well, harder than its software counterpart, true. For one thing, there are the costs that article mentions, although I don't fully agree with the suggested cost per iteration, as most testing can be done in simulators, with the physical implementation only the last step in a broader development chain. For another, unlike software, at the hardware level you are into real engineering, forced to deal with real-world quirks and imperfections. Hardware complexity cannot balloon the way software complexity does. You cannot add "features" for their own sake (just to scratch your intellectual itch) without much consequence the way you can in software, and this makes it boring and a less appealing target for development. When it does happen, though, you get the problem described: having to read hundreds of pages of datasheets and then deal with both all that hidden complexity and the real world. That being said, with some discipline, hardware development can be done!
Maybe, just maybe, there are other ways. Musk did it with rockets. I'm not knowledgeable, but very often, long proven techniques and status quo let people stop trying to find alternative paths.
This was an incredibly well written article. I didn't realize who had written it until it referenced the GPLv3. The arguments are convincing enough that I'm going to use GPLv3 rather than the Apache 2.0 license for my future open hardware work.
The thing is, despite RMS touting it of course, most of the GPL is completely irrelevant to hardware designs. Please consider choosing a simpler license. For most design files a CC-BY-SA license will have the same effect and the intention will be much clearer. If you want to make explicit patent grants, Apache 2.0 is actually a good fit. There is even a modified version aimed at digital fabrication which lets you specify that a certain mark has to be included if the design is copied and fabricated [1].
GPL is good for software, HDL and circuit masks only. I have seen a lot of debate around what one is allowed to do with GPLed PCB footprints. Bit of a pain in the ass really.
They are very similar. Both include an explicit patent grant. However, GPLv3 requires that redistributions keep the original HDL intact, which I prefer. The fact that one of its primary authors considers it a good choice for hardware is a factor too.
The Novena open-source laptop was designed in Altium, so all of its source files are in that format. As a result, someone took up the effort to make an Altium to KiCAD converter: http://www.kosagi.com/forums/viewtopic.php?id=65
People love to complain no matter what you do... let them, they're killing themselves by doing that, not you. Just keep building the free world using the tools you have, and the rest of us will surely appreciate it.
Funny thing is, before the age of throwaway electronics, one could get schematics for just about anything. Today I was fixing a Lambda (now TDK) high voltage supply. The service manual has detailed schematics, parts lists, and troubleshooting procedures. All Agilent and Tektronix equipment had service manuals. One could even go to Sears and buy parts to fix the toys they sold.
I don't know if it's really manufacturers trying to protect their IP, or just not wanting to spend resources on publishing schematics and dealing with the questions from the public.
Both of the possibilities that you suggest seem logical to me. I can imagine other dimensions to it as well, not the least of which could be planned obsolescence.
Today's equipment has a lot more embedded devices such as FPGAs and microcontrollers, so it's going to be a lot more difficult to rework these parts and get firmware updates. Some parts might not be programmable in the field (i.e. they need to be updated before they get soldered onto the board).
So I think the romantic days are gone for a reason.
I think it's related to both of those factors, but also the continued miniaturization of parts, and the shrinking of package sizes. Rework on modern day electronics is infeasible without somewhat expensive tools and a fair amount of skill.
I was being very generous (good microscope, high-end Oki heat gun and soldering iron which professionals would use).
Chinese imports are in the $100 range for the gun and iron. A microscope, probably not, unless you can get by with a USB microscope of some form. Although, you can get by with the same stuff that jewelers use (glasses, loupes, etc.).
And, I'm sorry, but good tools for doing anything cost actual money. Anvils, sledgehammers, etc. all cost $50 or more.
However, you are significantly underestimating the cost to hack on an Apple II. First, the Apple II was damn expensive--it was almost the price of a new CAR. Chips were way more expensive, and you needed a lot more of them. Edge connector cards and edge connectors were very expensive.
There was a reason why so many people loved the TRS-80 Color Computer for robotics and control applications for many years. It had analog ports you could get at with inexpensive connectors and the connector on the side had almost every digital signal and wasn't ferociously expensive (although, it wasn't cheap. IIRC, prototyping cards for it were in the $70 range in 1981).
Things are WAY better now for the hobbyist even with surface mount technology.
Things used to be much better. Nearly all early computers included complete schematics. My first education about computer hardware came from studying the schematics of the Apple II.
Big, useful manuals scare neophytes. As such, the awesome schematics included on a lot of early systems were removed along with nearly the entirety of the manual because they just added bulk for the majority of users. Also, as boards have gotten more and more integrated, the value of those schematics for someone just getting into hardware has diminished. The boards are now 8+ layers, which means most of the traces are now hidden from view, and the components themselves tend to be little more than a fairly simple power regulator, big huge monolithic chips with hundreds of pins, a few filter capacitors, and a few pull up/down resistors. There's too much "there" there to really be useful for the beginner.
However, this is where the Arduino is very useful. It's a small chip and a small number of components, which increases the value of having a schematic that people can trace, keeping in their heads what the various lines are doing. More than a bit of me thinks that the ARM chips that are starting to squeeze the 8/16-bit micro market from above may be useful here: a decent number of pins, decent speed, and something you can hack that does more than blink pretty lights. Perfect for giving kids a schematic they can follow and really understand what's going on.
Apple may have felt that they were sufficiently protected by the copyright on their software, including the ROMs. Indeed, Apple defeated a clone maker, Franklin, based on this protection.
Release of the design had an interesting side effect. Since the entire ROM could be disassembled, developers started exploiting undocumented features, for instance by manipulating RAM locations and branching into the middle of routines. A book, What's Where in the Apple II, contained hundreds of useful addresses and entry points. As a result, Apple lost control of their API, and could not upgrade the system without breaking popular software titles.
The Apple //e was a heroic effort to preserve software compatibility, but included a bare minimum of new features. I suspect this may have been a factor towards Apple developing their next big thing -- the Mac -- as a closed system.
A similar thing happened to the IBM PC: developers started hard-coding the known addresses of display RAM rather than going through system calls, which helped freeze the PC's memory map into the famous 640k barrier.
For me the missing link in OSHW is proving the microscopic circuitry we depend on is the same as the design published by the chip designer. It's a form of hashing for circuitry.
Chinese intelligence agencies are convinced the NSA puts back doors in their CPUs and HDDs; the NSA is convinced the Chinese are doing it to them. And every other intelligence agency just throws its hands up.
This situation could easily spiral downwards into a halt in chip improvements, and yet an "institution" such as verifiable open hardware could let everyone trust again.
The solution here is to release the schematics and the resulting gate-level netlists. People who want to verify designs can use scan/JTAG to verify that designs do what they claim.
I understood that JTAG effectively means trusting what the circuit reports back - a bit like asking a binary to tell you its own md5.
I am interested to know if I am wrong (it's not uncommon) and if scan chains might solve that too.
A different approach might be to fingerprint different areas of the chip under different inputs - so, for example, when a million zero words pass over this bus, everything connected to it behaves in a unique way that would not occur if the chip were physically different from its schematic?
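The fingerprinting idea above can be sketched in software with a toy gate-level simulator: run a fixed stimulus through the published netlist, record the observable outputs, and hash the trace. The netlist format and gate set below are invented purely for illustration; a real flow would compare scan-chain dumps from silicon against the vendor's published netlist rather than two software models.

```python
# Toy sketch: fingerprint a circuit by hashing its output trace over
# a fixed input sequence. A physically different circuit (modeled here
# as a tampered netlist) yields a different fingerprint.
import hashlib

def simulate(netlist, inputs):
    """Evaluate (output, gate, in_a, in_b) entries in order; return all wires."""
    wires = dict(inputs)
    for out, gate, a, b in netlist:
        if gate == "AND":
            wires[out] = wires[a] & wires[b]
        elif gate == "OR":
            wires[out] = wires[a] | wires[b]
        elif gate == "XOR":
            wires[out] = wires[a] ^ wires[b]
    return wires

def fingerprint(netlist, stimulus, outputs):
    """Hash the circuit's observable outputs over a fixed stimulus."""
    h = hashlib.sha256()
    for vec in stimulus:
        wires = simulate(netlist, vec)
        h.update(bytes(wires[o] for o in outputs))
    return h.hexdigest()

# A half adder: sum = a XOR b, carry = a AND b.
golden = [("sum", "XOR", "a", "b"), ("carry", "AND", "a", "b")]
# A "tampered" variant with the carry gate swapped to OR.
tampered = [("sum", "XOR", "a", "b"), ("carry", "OR", "a", "b")]

stimulus = [{"a": x, "b": y} for x in (0, 1) for y in (0, 1)]
fp_golden = fingerprint(golden, stimulus, ["sum", "carry"])
fp_tampered = fingerprint(tampered, stimulus, ["sum", "carry"])
print(fp_golden != fp_tampered)  # the tamper shows up in the fingerprint
```

The limitation the parent comment raises still applies: if the chip itself computes and reports the fingerprint, a malicious chip can lie, so the measurement has to come from a path you trust (external scan equipment, or physical inspection).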
There are fundamental flaws in this idea of trying to treat hardware as software. They are very different animals. A better designation might be "physical products". It seems these discussions tend to ignore the fact that physical products are not simply circuit boards with a handful of chips and code.
As an example, we developed an advanced high power LED light source. The design took a year of engineering. During this time hundreds of LEDs and a myriad of circuit technologies and driving approaches were explored. Thermal management alone required six months of iterating through a family of designs. Initial designs were tested and fine-tuned with thermal FEA tools, and physical test candidates were machined in our CNC shop for real testing.
Past that, some of the advanced approaches we used required designing and building custom manufacturing tools and production test equipment.
Just think about the cost, not just in man-hours but also prototyping, testing, materials, regulatory, etc.
So it is because of this, and over 30 years in the hardware manufacturing world, that I say, with all due respect, that the view being put forth is rather myopic and naive. Fine for little Arduino-type gizmos and perhaps even some chips, but it breaks down very quickly once you start to hit what I am going to call "the real world". This is where reproducing or iterating a design might require an infrastructure in the hundreds of thousands of dollars and potentially deep multidisciplinary capabilities.
In the software world any 15 year old with enough motivation to dive into a large open source project can do so armed with a $200 laptop. They can learn and iterate at practically no cost. Not so in the case of physical products.
Another fundamental thing to consider is what the benefits of open hardware actually are.
When I make software free, it can easily be forked and contributed to, and it can scale to thousands of contributors. Consider how little it costs, in man-hours, to write a patch for Linux.
Now say I make my hardware free. How do you submit a patch to a piece of hardware? How many man-hours does it take? How do I test said patch? How do I accept pull requests? Can a piece of hardware have thousands of contributors? It is far less trivial than with software, considering the special equipment one needs to design physical hardware.
A lot of the free software movement revolves around improving software because it is more open. If I release open hardware, I don't see how the same will hold true for free hardware. Open hardware is just a giveaway.
I believe hardware is more akin to art; no one submits a PR to the Mona Lisa.
I've gotten several pull requests to my free hardware design that I sell as my primary source of income. Granted, a PCB change is difficult to test given the minimum order quantity, but other physical designs like cases are easy to do a one-off if you have access to a laser cutter or 3D printer.
As someone mentioned elsewhere in the comments, RMS thinks on a much longer time scale than you and I. Prototyping PCBs is already dramatically easier than it was ten years ago; I'm convinced the cost and MOQ will only continue to drop.
It's also a shame this article made no reference to the Open Source Hardware Association (http://www.oshwa.org), who've been doing an awesome job promoting open hardware.
I feel like this is essential in pushing humanity forward. Once it becomes commonplace to build a business model around OSHW and drive freedom deeper down the stack we will be that much closer to a post-scarcity world.
An interesting article in light of a brief conversation that I had with RMS in the early 2000s, after one of his lectures.
At the time I was involved with Opencores and asked RMS about his thoughts on Free Hardware. He was dismissive, on the basis that it wasn't software, so the four freedoms couldn't apply to it, therefore it was out of scope.
It's great that he has come around! I suspect that he hasn't so much shifted his position, as gained a better understanding of hardware and can now see the applicability. If only I had been able to give a better explanation at the time!
> I suspect that he hasn't so much shifted his position, as gained a better understanding of hardware and can now see the applicability. If only I had been able to give a better explanation at the time!
I suspect it has more to do with the improvements in fabrication technology since then; in particular, its dramatically increased accessibility to the everyday hacker.
I have a "You can build my design, but you must send me a picture of the finished device, and let me post it on my blog" license for my stuff. It's surprising how many people don't even bother to do that.
I personally wouldn't take someone to court over this; a slap in the face is sufficient. But if I were to, my understanding is that it's copyright infringement: