Hacker News | vodou's comments

You gotta love that Tcl/Tk node-based user interface in Max/FTS, something that is still alive today in Pd (Pure Data). Sure, it is clunky and probably not in line with any modern UX design rules. But at the same time it is kind of timeless, brutalist, and stylish in all its monochrome ugliness.


Ttk can be themed to match your GTK theme :D.

You get the modern-ish bling-bling along with Tcl/Tk's RAD-style development.


:%s/ugliness/beauty


Windows Sandbox, together with WSL, has liberated me from VirtualBox/VMware Workstation. So thankful for that. Now I'm just waiting for native USB support in WSL.


In case you don’t know about it, there is a good workaround based on USB over IP that is officially recommended by MS.

I used it a while ago to flash an ESP32 and to connect a Zigbee adapter to a Linux container. Had no issues with it.

https://learn.microsoft.com/en-us/windows/wsl/connect-usb


This is useful on the USB support front: https://learn.microsoft.com/en-us/windows/wsl/connect-usb


Somewhat related: https://en.wikipedia.org/wiki/Automavision

"Automavision is a cinematic technique invented by Danish director Lars von Trier.

Developed with the intention of limiting human influence, in Automavision no cinematographer is actively operating the camera. The best possible fixed camera position is chosen and then a computer chooses framing by randomly tilting, panning or zooming the camera. In doing so it is not uncommon that the actors appear in the shots with a part of their face and head cut from the frame. With this technique then the blame for any "errors" are entirely attributable to a computer."

I always felt this was a bit of a joke from Lars von Trier, though. I am not aware of any film other than "The Boss of it All" where it was used. A good joke, after all. As von Trier says:

"If you want bad framing, Automavision is the perfect way to do it."


"The Boss of it All" takes place in some kind of IT software firm. I'm pretty sure "automavision" is inspired by the kind of online meeting camera which tries (and often fails) to focus on the person talking.

While most of the satire in the movie is unrelated to IT, there are some in-jokes. In one scene, a developer grows suspicious of the boss of it all (who is really a hired actor with no understanding of the business) and asks, "Do you even know what agile development means?" The actor, trained in improvisational theater, cleverly retorts, "Well, can you explain what agile development is?", which shuts the developer up.


> a developer is getting suspicious towards the boss of it all (who is really a hired actor with no understanding of the business)

The Consultant is similar, though it's one of those obsessively weird-for-the-sake-of-it productions.


You could extend this technique to realtime random framing. Shoot everything with a fisheye lens, then screen the film with a player application that randomly crops each scene at runtime (applying correction for the fisheye distortion).

Maybe call it Automascope.
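The runtime cropping part is easy to prototype. Here is a toy sketch in Python (function names, frame sizes, and the seeding scheme are all made up for illustration): pick one random crop window per scene, seeded so a given screening is reproducible.

```python
import random

def random_crop(frame_w, frame_h, out_w, out_h, rng=None):
    """Pick a random crop rectangle of out_w x out_h inside a frame.

    A player would apply this once per scene (after fisheye
    correction), mimicking Automavision's random framing at
    playback time rather than on set.
    """
    rng = rng or random.Random()
    if out_w > frame_w or out_h > frame_h:
        raise ValueError("crop larger than frame")
    x = rng.randrange(frame_w - out_w + 1)
    y = rng.randrange(frame_h - out_h + 1)
    return x, y, out_w, out_h

# One random framing per scene; seeding makes a screening repeatable.
rng = random.Random(42)
crops = [random_crop(3840, 2160, 1920, 1080, rng) for _ in range(3)]
```

A real player would of course undo the fisheye distortion before cropping, as the parent comment notes.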


Man, I’ve been thinking for years that I should re-watch The Boss of it All. IIRC, it was very funny.

Unfortunately not available on streaming anywhere. Hopefully available via “other means”.


I did recently, and it is still as good as that first time I saw it, together with one other person in an otherwise empty arthouse movie theater. I picked it completely at random, on a day I skipped work, and the whole thing felt completely magical.


Wonderful!


Just trying to get my head around SDR.

What are some examples of practical use cases for a platform like this? What are the benefits of this SDR platform compared to other solutions?


A few practical use cases that are not easily covered by similar platforms:

1) Direction finding thanks to the 8x 153 MSPS ADCs and coherent clocks.

2) Mixed domain analyzer: have one daughterboard act as an RF receiver while the other samples an analogue voltage at the same time. This is a capability reserved for the most expensive test equipment, and it lets you analyze how an RF switch is behaving (or do side-channel attacks?).

3) Sample almost 600 MHz of bandwidth in real time, use the powerful DSP core to run FFTs on it, and send the results over to a browser that implements an RTSA display. This lets you have a real-time view of the spectrum around you for just a few watts. Thanks to the double PLLs on the Granita board, you can also sweep the spectrum very fast.

4) There is enough processing power onboard to enable RFNM as a 5G RedCap node. We are working with NXP to add an eSIM, so with the right software, this can become a fully-functional 5G UE and connect to the normal cell network. Don't care about 5G? You can write your own standard and deploy it on the same hardware (the limitation here is having access to NXP's DSP development tools, which might limit the processing to the beefy i.MX 8M Plus, but some cores will be available as binaries).

5) Technically, anything requiring an insane amount of ADCs and DACs. You can implement your own board, as the heavy lifting (the motherboard) is already done for you. You could prototype something easily with the development board that's on the website and turn it into a real design within weeks.


Does this offer 4x RX and 4 TX channels? Or is it 8x RX channels? Is each individual channel capable of MSPS?


8x ADCs and 2x DACs @ 153 MSPS -> 4x RX I/Q pairs and 1x TX I/Q pair. The way the math works, you can sample a 153 MHz signal at 153 MSPS using I/Q, or you can use each ADC line to sample at Nyquist, in which case you get 8x RF channels, each sampling 80 MHz at most. All untested, of course.
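The arithmetic behind that, as a back-of-the-envelope sketch in Python: with complex (I/Q) sampling the usable bandwidth equals the sample rate, while a single real-valued converter is capped by Nyquist at half the sample rate (the raw limit works out to 76.5 MHz, close to the ~80 MHz per channel quoted above).

```python
# Back-of-the-envelope for the sampling claims above.
fs = 153e6  # 153 MSPS per converter

iq_bandwidth_hz = fs       # one I/Q pair -> 153 MHz of spectrum
real_nyquist_hz = fs / 2   # one ADC alone -> 76.5 MHz theoretical max

print(iq_bandwidth_hz / 1e6, real_nyquist_hz / 1e6)
```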


Once Arctic Semi puts the Granita into normal commercial production, what do you guess the 1ku pricing to be?


Typo: Is each RX channel capable of 153 MSPS?


Tons of applications and uses in electronic warfare and electronic intelligence gathering.

Also forms a lot of the basis of RADAR, though that's a separate use case.

To oversimplify: SDR allows you to build a radio based around math instead of complex electronics, or at least around less complex electronics. It sacrifices a bit of performance but confers a lot of the advantages you get in other software systems: updates, reconfigurability, etc.
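As a concrete (toy) illustration of "radio built around math": detecting a tone in sampled data with the Goertzel algorithm, in plain Python, no RF front end in sight. The sample rate and frequencies here are arbitrary illustration values.

```python
import math

def goertzel_power(samples, target_hz, fs):
    """Relative power of one frequency bin (Goertzel algorithm).

    The kind of per-frequency detection SDRs do in software where
    an analog radio would need a tuned filter circuit.
    """
    n = len(samples)
    k = round(n * target_hz / fs)
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# A 1 kHz tone sampled at 8 kHz: strong at 1 kHz, silent at 2 kHz.
fs, n = 8000, 256
sig = [math.sin(2 * math.pi * 1000 * t / fs) for t in range(n)]
p_1k = goertzel_power(sig, 1000, fs)
p_2k = goertzel_power(sig, 2000, fs)
```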


Many radio systems are already SDR, just closed. The benefits are the same as building anything in software rather than hardware: you can update it later, support more variations, etc. The downside is more power consumption, so cellphone radios are not usually completely SDR (I'm not an expert on those, so correct me if I'm wrong).


This site has some great examples of what various inexpensive SDR hardware can be used for: https://www.rtl-sdr.com/. Use cases range from receiving weather satellite imagery to reverse engineering radio protocols (e.g., for remote door openers). Several years ago I used an inexpensive DVB USB dongle containing the RTL2832U chip (which enables SDR) to sniff 2G/GSM broadcast packets.


compared to other radio solutions, being able to do more digital analysis on the raw signal lets you do cool things that were previously impossible

at one place, we replaced entire banks of traditional radio scanners using a single SDR

a scanner could only dial into one frequency at a time, but an SDR let us capture entire swathes of the frequency spectrum and extract concurrent streams from multiple sources simultaneously

I'd love to sit down with my Lime and end-to-end my own cell base station... one day
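The scanner-vs-SDR point above can be sketched in a few lines: one captured block of samples, several "channels" read out of it simultaneously by evaluating different DFT bins. A toy illustration with made-up frequencies; real channelizers use polyphase filter banks rather than single-bin DFTs.

```python
import cmath
import math

fs, n = 8000, 256
# One wideband "capture" containing two stations at once.
capture = [math.sin(2 * math.pi * 1000 * t / fs)
           + 0.5 * math.sin(2 * math.pi * 3000 * t / fs)
           for t in range(n)]

def bin_magnitude(samples, freq_hz, fs):
    """Magnitude of one DFT bin: one 'channel' pulled from the capture."""
    k = round(len(samples) * freq_hz / fs)
    acc = sum(x * cmath.exp(-2j * math.pi * k * t / len(samples))
              for t, x in enumerate(samples))
    return abs(acc)

# All channels come from the same samples; no retuning between them,
# unlike a scanner that dials into one frequency at a time.
ch1 = bin_magnitude(capture, 1000, fs)
ch2 = bin_magnitude(capture, 3000, fs)
quiet = bin_magnitude(capture, 2000, fs)
```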



How about you let us know when it says yes.


I have been curious about CLion for some time. What holds me back is the price and the fact that it is written in Java (I know, the latter is a really bad reason).

What are the main pros compared to VSCode?


It's a full IDE rather than a text editor with some IDE-like functionality. Most importantly (to me): much better refactoring tools and much better debugging tools.

No need to immediately hand over money: there's a free 30-day trial, and you can regularly get an early-access (i.e., beta) build of the next version for free.


Modified BSD License (3-clause BSD), so it is permissive. Not apparent from the header file alone.


Are any of these still needed, though? I kinda assumed they would be like Duff's Device: a nifty historical relic, firmly in the "let the compiler do this for you" category in modern C.


Many of those are not standard functions (or were not, until very recently). You have to implement them somehow, so having a ready-made collection that also performs well is very nice to have.
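The linked collection isn't quoted here, but as an example of the kind of ready-made trick such collections provide, here is the classic branch-free round-up-to-a-power-of-two, transcribed to Python for illustration (the C version works the same way on a fixed-width unsigned integer).

```python
def next_pow2(x):
    """Round a positive integer up to the nearest power of two.

    The shift-or cascade smears the highest set bit of x - 1 into
    every lower bit position, so adding 1 lands on a power of two.
    Covers 32-bit values; wider types need more shift steps.
    """
    x -= 1
    x |= x >> 1
    x |= x >> 2
    x |= x >> 4
    x |= x >> 8
    x |= x >> 16
    return x + 1

results = [next_pow2(v) for v in (1, 3, 16, 17)]
```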


When you're working on Frostbite you definitely need these.

Optimizers are smart, but they aren't smart enough yet.


How do you let the compiler do these things for you?


By having someone write the optimization passes to detect these :)


Compile with optimizations enabled.


What code specifically do you write instead of the code in TFA, in order to have the optimizer generate the code in TFA? In general compiling with optimizations enabled has not produced particularly good code in my experience.


"For example, the once-popular DEC VAX, DEC Alpha, and Sun SPARC ISAs are extinct."

Extinct is a bit harsh. SPARC V8 is alive and kicking in the space industry as the LEON processor models. Not the most performant CPU, but good enough for a lot of applications.


It would be imprudent to base a whole new business on the SPARC ISA (if that's even possible - would Oracle license it?), unless you were in some niche like the space industry where interoperability is important.

And David is quite right that the Digital architectures are really extinct.

[Now I wonder if Compaq would open source the Alpha IP. I think they licensed a lot of it to Intel, so sadly they probably still make a bit of money from patents there.]


SPARC has been "open" for several decades at this point.

All of the ISA specifications are freely available[1] and you can buy a license to manufacture SPARC hardware for $99.

[1] https://sparc.org/technical-documents/


https://en.wikipedia.org/wiki/DEC_Alpha says compaq sold "all Alpha intellectual property" to intel in 02001, so probably they haven't made any money from it since then

also though any us patent filed since 01995 expires 20 years after filing, so any us αxp patents filed in the period 01995–02001 have expired already, and us patents filed before 01995 expire 17 years after issuance, so we only have to worry about pre-01995 us patents if first dec and then compaq and intel managed to delay their issuance, lemelson-style, until 02005 or later, which seems extremely unlikely

so αxp is almost certainly already open-source as an isa

dec's implementations are i suppose not open-source though, and casual glancing around on opencores and the like doesn't turn up any obvious candidates


The people behind LEON have a next-generation family of cores, NOEL, which are RISC-V based.

For the time being they are still selling LEON but I'd expect it to be obsoleted at some point.



I've never bothered to learn Python asyncio. When Python 3.5 came out, I just thought it looked overly complex. Coming from a C/C++ background on Linux, I just use the select module for waiting on blocking I/O, mainly sockets. Do you think there is something to gain for me by learning asyncio?


Personally, I don't think there is a benefit. If select is working for you, asyncio doesn't add anything performance-wise. It is just meant to look more synchronous in how you write the code. But using select and either throwing work onto a background thread or doing the work quickly inline (if it isn't CPU bound) can be just as clear to read, if not clearer. Sometimes async and await only obfuscate the logic more.
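For comparison, the asyncio style being discussed looks roughly like this (a toy sketch; the coroutine names and delays are made up): the code reads sequentially, while the event loop does the select()-style multiplexing underneath.

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for a blocking socket read; control is yielded to the
    # event loop at the await, so other coroutines can run meanwhile.
    await asyncio.sleep(delay)
    return name

async def main():
    # Reads like sequential code, but both "reads" overlap in time.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.01))

results = asyncio.run(main())
```

Whether this reads more clearly than an explicit select loop plus callbacks is exactly the matter of taste the parent describes.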


8272 lines of code in a single file.

I guess you could talk about coding standards and code smells all day, but let's not. I'm impressed.


This is pretty common in the C world.

Look up the source code of common Unix utils, too, or a popular game...

LS [0], WolfensteinET (idtech3) [1]

[0] https://github.com/wertarbyte/coreutils/blob/master/src/ls.c

[1] https://github.com/etlegacy/etlegacy/blob/master/src/game/g_...


SQLite has the option of a single-file build (the amalgamation). They found that the compiler can optimize best within one compilation unit.

