I'm sure many of them have done interesting things.
But the area of broadcast I write software in has a large number of these engineers holding back progress; most are close to retirement age and now have heavy influence in standards bodies. For uncompressed video over IP, for example, they propose insanely tight standards, such as that packets must arrive within ~40µs (yes, microseconds, not milliseconds) of their allotted time, something very difficult to achieve in software. (I will be talking about this at Demuxed next week...)
They also insist that legacy artefacts from the analogue era, such as VBI, must stay for another generation to save around 1ms of latency (and, of course, their precious line buffers).
They are one reason the broadcast industry can't move as fast as the web. As I've explained to them, some of us will have to deal with the mess they have created in 40 years' time.
> packets must arrive within ~40µs (yes, microseconds, not milliseconds) of their allotted time, something very difficult to achieve in software. (I will be talking about this at Demuxed next week...)
Maybe in Ethernet-land, but this is trivially achievable (actually exceeded by more than an order of magnitude) with off-the-shelf InfiniBand gear that is available for peanuts these days. It's close to two orders of magnitude if you shell out for the latest gear.
Yes. Adapter latency is guaranteed to be 0.5-5µs depending on generation (later revisions are faster, with lower latency). Switches add about 170ns per hop, plus whatever speed-of-light delay your cables add. 1µs end-to-end is realistic for modern gear.
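To put numbers on that, here's a back-of-the-envelope budget using the figures above; the hop count, cable length, and propagation speed are my own assumptions:

```python
# Back-of-the-envelope IB latency budget. Adapter and switch figures are
# from the discussion above; cable length and propagation speed are assumed.
ADAPTER_S = 0.5e-6        # per adapter, modern generation (0.5-5 us range)
SWITCH_S  = 170e-9        # per switch hop
CABLE_M   = 10.0          # assumed total cable run
PROP_M_S  = 0.7 * 3.0e8   # signal speed in the cable, roughly 0.7c

hops = 1
end_to_end = (2 * ADAPTER_S          # sending + receiving adapter
              + hops * SWITCH_S      # switching delay
              + CABLE_M / PROP_M_S)  # propagation delay

print(f"end-to-end: {end_to_end * 1e6:.2f} us")  # ~1.22 us
```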
Of course, with IP-over-IB you also need to factor in whatever latency your network stack adds, but InfiniBand also supports RDMA, so IB-aware applications can essentially bypass the stack entirely for high-performance or hard-real-time use cases.
Basically, InfiniBand takes the network out of the equation here. The bottleneck then becomes how precisely you can time the software.
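If you want a feel for how imprecise ordinary userspace timing is, here's a crude experiment you can run; the 125µs interval is purely illustrative, not taken from any standard:

```python
# Measure how precisely plain userspace code can hit a fixed send deadline.
# On a typical desktop the worst case easily blows past a ~40 us window
# whenever the scheduler preempts the process.
import time

INTERVAL_NS = 125_000  # target packet interval, ns (illustrative)
deadline = time.perf_counter_ns()
worst_ns = 0

for _ in range(10_000):
    deadline += INTERVAL_NS
    # Busy-wait: sleep() granularity is far too coarse at this scale.
    while time.perf_counter_ns() < deadline:
        pass
    late_ns = time.perf_counter_ns() - deadline  # how late we woke up
    worst_ns = max(worst_ns, late_ns)

print(f"worst-case overshoot: {worst_ns / 1000:.1f} us")
```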
It's basically a meme referencing the "good old days" of traditional gender expression. It's usually used with some amount of irony or sarcasm, but occasionally in earnest. I think the classic phrase was "when men were gentlemen and women were ladies", but in geek circles it's probably better known from Douglas Adams's riff on it:
"In those days spirits were brave, the stakes were high, men were real men, women were real women, and small furry creatures from Alpha Centauri were real small furry creatures from Alpha Centauri."
edit: it also occurs to me that "real men" was probably somewhat (re-)popularized by the 1982 book "Real Men Don't Eat Quiche", which was poking fun at rigid and nonsensical concepts of masculinity, but a lot of people use the pattern of "real men don't X" very seriously.
There was also the internet variant, "The Internet: where men are men, women are men, and children are FBI agents", which has been doing the rounds since at least the mid-'90s.
I thought it was best known from Linus's "do you long for those days when men were men and wrote their own device drivers" line (from memory; I may have gotten a word wrong).
Just because it's a popular idiom doesn't mean it isn't an unfortunate one.
Your Douglas Adams quote does, however, show a use of it where women are also acknowledged, and we can all go home not stinking of casual sexism for the sake of a catchy idiom.
> in 1989 there wasn’t a computer that could do this in near real time
Does this hold even if we consider the fastest computers, like Crays? A '89 Y-MP could do 2.7 GFLOPS or about 333 FLOPS per NTSC "pixel" in each NTSC field. Smaller computers of the era might be more economical for the required processing power, maybe one of the Thinking Machines SIMD MPPs?
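Rough sanity check of that figure; the active pixel counts per field are my approximation, and "FLOPS per pixel" obviously depends on how you choose to count pixels:

```python
# Sanity-checking the Y-MP FLOPS-per-pixel figure from the comment above.
YMP_FLOPS = 2.7e9
FIELD_HZ  = 60_000 / 1001  # NTSC field rate, ~59.94 Hz
LINES     = 240            # active lines per field, approx.
SAMPLES   = 560            # active samples per line, approx.

flops_per_pixel = YMP_FLOPS / FIELD_HZ / (LINES * SAMPLES)
print(f"{flops_per_pixel:.0f} FLOP per pixel per field")  # ~335
```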
The demoscener in me disagrees with the statement entirely. ;-)
From the article:
> It requires 3 inputs, the first video, the second video being turned to, and a typically solid color/image that represents the back of the page
This effect probably doesn't require much processing power, since all it needs to do is switch (up to) twice on each scanline, from the pixels of A to the "back" to B; the switching points themselves can be calculated straightforwardly, since they vary linearly. Making the "back" itself another video doesn't add much extra processing. Here's a visual schematic of a few scanlines halfway through the effect (- = first video, + = second video, b = back):

  --------------bb++++++++
  ------------bb++++++++++
  ----------bb++++++++++++
  --------bb++++++++++++++
As a rough guess, it's probably doable in around a dozen extra calculations per scanline, and an NTSC scanline lasts slightly less than 64µs. Assuming (a very generous) 100 clock cycles for those calculations, and hardware with a video mux so the software doesn't have to copy the pixels, that works out to require... slightly less than 1.6MHz.
Of course, if your hardware doesn't have a mux and you need to copy the pixels between various buffers, then the requirement goes up by around an order of magnitude... but that's probably still doable on something like a 33MHz 386.
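To make that concrete, here's a quick sketch of the mux-point calculation; the frame geometry and the apparent width of the page back are made-up numbers:

```python
# Sketch of the per-scanline mux calculation described above.
LINE_US      = 63.5     # duration of one NTSC scanline, in microseconds
CYCLES       = 100      # generous per-scanline budget from the estimate above
WIDTH, LINES = 560, 240 # active picture per field, approx.
BACK_WIDTH   = 24       # apparent width of the page back, pixels (assumption)

def switch_points(line, progress, slope=0.5):
    """Return (x1, x2) for one scanline: video A for x < x1, the page
    back for x1 <= x < x2, video B for x >= x2. `progress` runs from
    0.0 (untouched page) to 1.0 (fully turned)."""
    # The fold is a straight diagonal, so its x position is linear in
    # both `line` and `progress`: no trig, just a multiply and an add.
    fold = WIDTH * (1.0 - progress) + slope * (line - LINES / 2)
    x1 = max(0, min(WIDTH, int(fold)))
    x2 = max(0, min(WIDTH, int(fold) + BACK_WIDTH))
    return x1, x2

# Halfway through the turn, the middle line switches near mid-screen:
print(switch_points(120, 0.5))                        # (280, 304)
# And the clock needed to spend CYCLES per scanline is tiny:
print(f"required clock: {CYCLES / LINE_US:.2f} MHz")  # ~1.57 MHz
```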
When do you think it was that Silicon Graphics added the capability to take not just one but two video inputs at the same time? That is a crucial feature for the page turn. Real-time video effects at 50/60 fps broadcast resolution were not easy. The signals would have been component analog, not digital, with everything having to be synchronised.
I doubt the SGI O2 would have had the chops for this task, and that was a latter-day model costing proper workstation money, supposedly with the I/O for animation and the like.
So why would you spend real money on an SGI 'deskside workstation' as used by moviemakers?
Back then, electronics-based solutions effectively worked on every pixel in parallel, not a pixel at a time as with CPU-style graphics. So changing something wholesale on an image, e.g. brightness or contrast, was instant, whereas a CPU would have needed to do a lot of work and could not have managed the instant refresh.
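As a loose modern analogy (numpy standing in for the parallel electronics; the array size is an approximate NTSC field):

```python
# "Every pixel in parallel" vs "a pixel at a time", side by side.
import numpy as np

frame = np.random.randint(0, 256, (240, 560), dtype=np.uint16)

# Hardware-style: one wholesale operation applied to the entire image.
brighter = np.clip(frame + 32, 0, 255)

# CPU-style: visit all ~134,000 pixels one by one, every field.
out = frame.copy()
for y in range(frame.shape[0]):
    for x in range(frame.shape[1]):
        out[y, x] = min(frame[y, x] + 32, 255)

assert (brighter == out).all()
```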
In broadcasting back then, a 1U box of kit would cost a lot of money no matter what it was. A clock could cost $100,000 and not even tell the time on the front of its rack-mount enclosure. So it would not have been a problem to spend $$$ on another piece of kit.
Fascinating. Is anyone providing workshops/courses for complete beginners? I'd love to have my team (of UI designers) get to play around in this field for inspiration's sake. If you provide this sort of thing, please ping me (details in my profile).