Curious if anyone knows of a short overview of CS or engineering topics (e.g. linear algebra, diff eqs, etc.) for keeping refreshed as you age. I'm not talking about a full tutorial; more like something you could scan through every few years just to make sure you're not forgetting the important stuff you haven't used since college.
As a side note, I recently interviewed with a couple companies in the valley and the focus of the interviews was not on 'gotcha' algorithm questions. There were a few, but I came away feeling that the process was more fair and reasonable than I'd been led to believe. Now, granted, I'm an embedded software guy and maybe they tailored the questions for me, but I feel like I wasted a month going over graph algorithms. It was a good learning experience, but my time would have been better spent reviewing more basic embedded topics.
You might look at Laub [1] for a short, somewhat advanced overview of linear algebra with some differential equations, as it's applied to classical engineering and science.
Skimming the materials at MIT OpenCourseWare is really good. Just reading the problem sets from the thermo course I took 30 years ago brought back lots of buried memories and forced me to learn some stuff I honestly hadn't at the time.
It's not taking an online course -- it's all the material handed out with an offline course. https://ocw.mit.edu/index.htm
"my time would have been better spent reviewing more basic embedded topics." In my own experience, embedded interview questions have all been derived from my experience working on embedded systems (I interviewed for an EE role). What kind of embedded questions were you asked?
Some things relating to board bringup (power-on to when the BIOS boots the OS, how DDR memory is initialized, etc.). Basic algorithms (bit counting), atomics, virtual memory, malloc()/free() implementations, etc. Given system X and problem Y, what could cause problem Y?
Not strictly embedded, but definitely not graph theory.
>A math genius called Alan Turing joined the British military to crack the German “Enigma” code. He knew they would never get ahead if they keep doing the calculations by pen and paper. So after many months of hard work, they built a machine. Unfortunately, It took more than a day to decode a message! So, it was useless :((((
I could not read past this. Yet another distortion of what was done at Bletchley Park. Yes, I understand, keep your readers interested, but not by making up stuff and claiming it was real. I think the author should find a better example of an important use of algorithms than this one.
To answer that would take an entire book! One book you might try is "Battle of Wits" by Stephen Budiansky, published in 2000 by The Free Press. ISBN: 0-684-85932-7
There was never a single breakthrough at Bletchley. The Bombe machines (there were eventually many of them) were useful, but were just one part of the overall effort, and they did not always get results. The idea that all messages ended with "Heil Hitler!" is just wrong.
>Alan’s team found out that every encrypted message ended with the same string: “Heil Hitler” Aha! After changing the algorithm, the machine was to decoded in minutes rather than days!