Yes, very true, obviously I just mean conceptually poking around. In many ways we definitely have it better now; I am infinitely more productive today. At the same time, we've drifted further from teaching the low-level stuff properly.
I spent hours, days, and weeks of my life on things related to graphics/games programming such as:
- Building mouse drivers from scratch or implementing them from alternative vendors
- Reverse engineering consoles to steal processing power from the most unlikely places just to render a tiny bit more data or, later, a few more polygons
- Spending a huge chunk of cash to put a math co-processor into my machine at home
- Debugging code for hours, only to discover the cause was something seemingly unrelated: the tape media was at fault, the floppy was corrupt, or the file system didn't work the way the vendor's spec said it did
- Rendering a scene and then going home, only to come in the next day and see it is still not done
- Converting between 72 billion formats and finding, interpreting, and/or correcting corrupt data in each one
- Rewriting entire chunks of a code base to squeeze several more bytes out of EMS and XMS
- Implementing two or more graphics APIs for the same game. Thank you 3dfx, S3, and the many others that pained me, not to mention, at a higher level, OpenGL, DirectX, and so on.
- Doing all my work on one platform, then loading it on another. Thank you SGI for taking years off my life.
- Writing matrix operations in pure assembler for the simplest of operations
- Having multiple workstations for reasons such as "this one has the Matrox card in it."
The list goes on. Yeah, I don't miss those days. But I learned a lot; we all did. And lowering the barriers to entry has definitely reduced the signal-to-noise ratio, IMO.
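For anyone who never had to do it: the "matrix operations in pure assembler" bullet boils down to loops like this. A toy Python sketch, purely illustrative; back then it was hand-scheduled assembly, not four lines of a scripting language:

```python
# Toy sketch (illustrative only): a 4x4 matrix times a 4-vector,
# the workhorse of every software transform pipeline.
def mat4_mul_vec4(m, v):
    # m is 4 rows of 4 numbers; v is a 4-element homogeneous vector.
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# Translate the point (1, 2, 3) by (10, 20, 30) with a homogeneous matrix.
translate = [
    [1, 0, 0, 10],
    [0, 1, 0, 20],
    [0, 0, 1, 30],
    [0, 0, 0, 1],
]
print(mat4_mul_vec4(translate, [1, 2, 3, 1]))  # [11, 22, 33, 1]
```

Same math we wrote by hand, except now the compiler (or the GPU) schedules the multiplies for you.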
There always have been, and always will be, problems; they're just different problems now...
We can't even imagine what kinds of crazy stuff the next generation will come up with, with all the resources they have available now. The barriers to entry are still there, but the goalposts have moved significantly.
A few more I left out:
- Implementing/working with DOS protected mode
- Manually compressing memory / implementing swapping
- Implementing blitting from scratch
- Implementing z-buffers from scratch
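The z-buffer one sounds exotic but is conceptually tiny: keep the nearest depth seen at each pixel and only write color when a new fragment is closer. A toy Python sketch of the idea (illustrative only, nothing like the fixed-point inner loops we actually shipped):

```python
# Minimal z-buffer sketch: per-pixel depth test before each color write.
WIDTH, HEIGHT = 4, 4
FAR = float("inf")

color = [[0] * WIDTH for _ in range(HEIGHT)]       # framebuffer
zbuf = [[FAR] * WIDTH for _ in range(HEIGHT)]       # depth buffer, init to "infinitely far"

def plot(x, y, z, c):
    """Write the pixel only if this fragment is nearer than what's stored."""
    if z < zbuf[y][x]:
        zbuf[y][x] = z
        color[y][x] = c

plot(1, 1, 5.0, 0xFF0000)  # far red fragment lands first
plot(1, 1, 2.0, 0x00FF00)  # nearer green fragment overwrites it
plot(1, 1, 9.0, 0x0000FF)  # farther blue fragment is rejected
print(hex(color[1][1]))    # the nearest fragment's color survives
```

That per-pixel compare is exactly what the hardware depth test does for free today.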