I've recently started writing my own HDL as an embedded language within Idris. The idea behind it is that you can take advantage of the rich type system, and you have the option to prove theorems about your circuits to gain confidence in them. There are also bindings to PicoSAT, so that some properties can be proved automatically. Here's a snippet of how it looks so far (though things are changing rapidly):
-- Bitwise unsigned less-than, most significant bit first:
-- x < y  iff  (not x and y) or (x == y and xs < ys)
lt : BoolOperation op => Vect n v -> Vect n v -> SSA op v v
lt [] [] = constOp False
lt (x :: xs) (y :: ys) = do
  nX <- notOp x
  t1 <- andOp [nX, y]   -- this bit decides it: x is 0 and y is 1
  t2 <- xnorOp x y      -- this bit is equal
  t3 <- lt xs ys        -- less-than on the remaining bits
  t4 <- andOp [t2, t3]  -- equal here and less-than below
  orOp [t1, t4]
I haven't touched the code since I graduated last year, and I've had various ideas about how I should restructure it since then. The most interesting part of the project is probably the implementation of guarded channels, which turned out to admit useful Functor and Applicative instances.
I hope to start as a PhD student next year, working on a new version of the language that takes into account what I've learned since then. Most notably, GHC's recursive do-notation sounds like it will be incredibly useful for creating the kind of monadic cyclic graph structures needed for hardware.
That's just combinational logic; how do you implement clocked processes and latches?
In my (admittedly limited) experience with hardware design, making correct pure combinational logic isn't generally too difficult; it's keeping track of state and state transitions that's problematic.
I'm glad you asked. At the moment, only combinational circuits are possible, but I'm working to extend this to synchronous (clocked) circuits.
When you need feedback, there will be a combinator that will look something like this:
feedback : Vect n Bool -> (Vect i v -> Vect n v -> SSA op v (Vect n v, Vect o v)) -> Vect i v -> SSA op v (Vect o v)
This is all up in the air. I'm still working through it.
I'm very keen to end up with something that is nice to use, even at the expense of being difficult to implement.
I may be getting ahead of myself, but I'd like to have functions that do things like take a combinational circuit and return a synchronous circuit where the logic is evenly distributed (in terms of delay) over n clock cycles.
There is so much potential in open source hardware design.
Right now, we are held back by (at best) mediocre tools. We need better HDL software, better synthesis software, and better EDA software before we have any hope of widespread use of open source hardware.
Imagine if the only compilers available were proprietary compilers locked behind hundreds of pages of licensing requirements and legal bullshit. Open source software would still be in the dark ages.
I applaud any effort to advance the open source hardware design toolset.
As a reminder, Stallman's first attempt at GCC used the Pastel compiler source code from LLNL. http://gcc.gnu.org/wiki/History
> Hoping to avoid the need to write the whole compiler myself, I obtained the source code for the Pastel compiler, which was a multi-platform compiler developed at Lawrence Livermore Lab. It supported, and was written in, an extended version of Pascal, designed to be a system-programming language. I added a C front end, and began porting it to the Motorola 68000 computer.
I presume that the Pastel compiler therefore didn't require "hundreds of pages ..." in order to use and redistribute it. Or if it did, it wasn't binding on those who acquired the source.
It did Pastel, which is related to Pascal but isn't actually Pascal.
I picked it as an example of a compiler where others could distribute the source. Since you want to narrow it to C compilers, I'll instead point to the proprietary Aztec C compiler. It didn't come "locked behind hundreds of pages of licensing requirements and legal bullshit".
In fact, as far as I can tell, it had no licensing requirements outside of it being covered by copyright. Certainly there are fewer requirements than the GPL.
Here's a list of C compilers for micros of the 1980s. http://www.z80.eu/c-compiler.html . In a spot check, I can't find any which have hundreds (or even tens) of pages of licensing requirements.
Do you have any evidence to support your earlier statement? I've noticed that people sometimes emphasize Stallman's impact out of ignorance of the other threads of history. How do I know that you aren't similarly inclined, given that your statement seems to differ from the written accounts from that time?
> There is so much potential in open source hardware design
I wish, but no, there isn't. The problem is that even if you could create an amazing design, there is nowhere to run it. FPGA? Expensive and slow. ASICs? Out of the question.
There is a reason why software is eating the world. Everyone has a CPU handy. And even if everyone also had an FPGA attached, it wouldn't matter. Show me a killer [hw] app.
Not all hardware design is digital integrated logic. Manufacturing medium-complexity PCBs (6+ layers, BGAs, etc.) is within hobbyist budgets (and certainly Kickstarters, thrifty startups, etc.), but the open and low-end design tools are not up to the task. For example, I'm not aware of an open design made with Eagle or free tools that has a fast, modern DRAM interface. Professional tools are extremely expensive ($10K-$100K), but certainly not more complex than, say, a modern C++ compiler like gcc or LLVM/clang.
I think this is where FPGAs might prove very useful. Even if the designs made on FPGAs are small-scale, amateurish or sub-optimal today, they will help develop an ecosystem of tools, including open-source tools, that will ultimately help larger-scale open-source hardware.
The problem with FPGAs is the design software. Have you ever tried to use WebPACK ISE or Quartus? Both of them blow. Terrible UI, terrible design, terrible licensing requirements.
We need simple, easy, powerful, and free tools for FPGA development before open source FPGA cores take off.
But I am suggesting an alternative to your second point. When FPGA boards become available to more people, the tools will automatically come. Note that the open-source tools can be built by software professionals, even though they might be amateurs in hardware design.
There are already cheap ($50 and less) FPGA boards. But all FPGAs are proprietary; there's no available documentation on the internals, the bitstream, and the many other things one needs to create third-party/open source tools.
And I also suspect they're hard, if not impossible, to reverse engineer.
It seems there is no synthesis (which is to be expected without vendor support).
With 'limited' conversion to VHDL/Verilog, I wonder how useful this is for actual implementation.
Also, from the site I couldn't find any mention of higher-level functionality such as generics.
MyHDL has been and is being used for production designs.
For parametrizability, you have the full Python power at your disposal. This is also true for conversion, because conversion happens after elaboration of the design.
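To make that concrete, here is a minimal sketch of elaboration-time parametrization, assuming MyHDL's classic Signal/intbv/always API (the module and its parameters are my own illustration, not something from the MyHDL distribution):

    from myhdl import Signal, intbv, always

    def shift_register(clk, din, dout, depth=4):
        # `depth` is an ordinary Python argument: the pipeline stages
        # are built with a plain list comprehension at elaboration time.
        stages = [Signal(intbv(0)[len(din):]) for _ in range(depth)]

        @always(clk.posedge)
        def logic():
            stages[0].next = din
            for i in range(1, depth):
                stages[i].next = stages[i - 1]
            dout.next = stages[depth - 1]

        return logic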
SystemC is the C++ equivalent that's been around for a long time. For a project I was working on, I used SystemC just for system-level modelling; it didn't bother me that I couldn't synthesize. To achieve decent clock rates it would have been a good idea to reimplement the whole thing in "native" Verilog anyway (though my project didn't get that far). What SystemC provided me with is a way to sanity-check my model at the "transaction" level.
Pretty cool to see this; going forward, there is much they could borrow from SystemC as a roadmap.
MyHDL is implemented as a minimalistic pure Python library, which basically means that you can use Python features for modeling - pretty powerful. In addition, it supports conversion of a language subset, so that you can also use it as a synthesizable RTL language. The latter feature is absent from SystemC.
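As a sketch of that conversion path, assuming MyHDL's toVerilog/toVHDL entry points (the adder module here is a hypothetical minimal example):

    from myhdl import Signal, intbv, always_comb, toVerilog, toVHDL

    def adder(a, b, s):
        @always_comb
        def logic():
            s.next = (a + b) % 2 ** len(s)
        return logic

    a, b = [Signal(intbv(0)[8:]) for _ in range(2)]
    s = Signal(intbv(0)[9:])

    # Elaborates adder(a, b, s), then converts the instance to Verilog
    # and VHDL; only the synthesizable subset is accepted here.
    toVerilog(adder, a, b, s)
    toVHDL(adder, a, b, s)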
MyHDL is not bad for quick prototyping. The main problem I had was using vendor-supplied libraries in simulation: I needed to completely model the vendor block (timing and input/output), which is quite error-prone.
OTOH, one of the nice things is the ease of moving from a numpy model of the system to RTL. And using PyPy for running simulations really makes MyHDL hum along quickly [0]
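For illustration, a toy version of that workflow might look like this (my own example, assuming MyHDL's classic Simulation API): a trivial accumulator checked cycle by cycle against a numpy golden model built with np.cumsum.

    import numpy as np
    from myhdl import (Signal, intbv, always, instance, delay,
                       Simulation, StopSimulation)

    def acc(clk, din, total):
        # Toy DUT: accumulate din into total on each rising clock edge.
        @always(clk.posedge)
        def logic():
            total.next = (total + din) % 2 ** len(total)
        return logic

    def bench():
        clk = Signal(bool(0))
        din = Signal(intbv(0)[8:])
        total = Signal(intbv(0)[16:])
        dut = acc(clk, din, total)

        samples = np.arange(1, 21)      # input stimulus: 1..20
        golden = np.cumsum(samples)     # numpy model of the accumulator

        @always(delay(5))
        def clkgen():
            clk.next = not clk

        @instance
        def check():
            for i, x in enumerate(samples):
                din.next = int(x)
                yield clk.posedge       # the DUT adds x on this edge
                yield delay(1)          # let the edge's updates settle
                assert total == int(golden[i])
            raise StopSimulation()

        return dut, clkgen, check

    Simulation(bench()).run()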
I don't know; I've been working with VHDL for 5 years and I don't see how this could be useful.
Usually we first develop our algorithms using fixed-point math in Matlab, which is a piece of cake.
After doing this, developing our VHDL based on the Matlab implementation is very straightforward.
Python isn't common knowledge for people developing hardware, so they'd really need to learn a completely new language made for another domain (software) in order to do the same thing!
VHDL/Verilog have been used for 30 years and in my opinion will continue to be used for many more; Python's language structure just doesn't have what's needed for the hardware domain.
I see myself wasting a lot of time having to teach our old hardware guys Python and how to think in terms of RTL with a new language. That's why I didn't try to switch.
Remember, most of the engineers working with hardware come from an EE background, not CS. They are more familiar with schematics than C.
This is extremely useful for people with CS backgrounds who want to get into hardware. My first dive into VHDL was disastrous; I feel like it would have gone much better if I had used something like this with a familiar syntax.
Very happy to see both projects (MyHDL and Chisel). I am not the target audience, but I was once interested in building something like this in Scala.
What is the uptake of Chisel in the target audience? I have observed that hardware designers that are not familiar with general programming language concepts don't appreciate the benefits of type-safe and object-oriented languages. Heck, even the software developer community is divided over the subject.