
When I was around 12 years old, three friends and I decided we wanted to create computer games on the 64 (and later the 128). We started to learn about sprites and how to do rudimentary animations. Nearly every day after school we'd share some new thing we had learned the night before. We'd pool our knowledge. It was such an incredible time. Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling. If my friend was having a hard time understanding something, I just figured out a different way to explain it. It's such a far cry from the nonsense I see in the various software dev communities today. Perhaps it's because none of us felt like there was a competition between us.



This is exactly it, one thousand times over. The C64 was a gateway drug, and everybody who was 'into it' really was a kindred spirit.

C64 BASIC ingrained one evil in me: "goto". At least once a month I'll be working on some script and I'm stuck, but a little voice in the back of my head will say: "a goto would fix this part you are stuck on!" And damn if that doesn't reverberate from the past.

I went the hardware route. I bought additional hardware for my C64. First the 1541 (5.25" floppy drive), then the 1581 (3.5" floppy drive), then a 300/1200 baud modem, then a 2400 baud modem.

I've been hardware hooked ever since. I'm now a sysadmin working with *nix servers.

I tried programming. I really did. I meticulously copied one of the sample programs out of the back of the C64 manual. It never worked.


Programming BASIC was a lot like programming assembly language in terms of how the GOTO and GOSUB keywords worked. GOTO led to a lot of criticism over "spaghetti code", but the limitations of GOSUB were worse.

GOSUB would push the calling location on a stack so you could RETURN to it later, but there was no stack for parameters, local variables, or return values, so you had to use global variables for all of those.

You could not write recursive functions in BASIC unless you implemented a stack yourself using arrays. It was easy to compute Fibonacci iteratively, but people had to sort in BASIC all the time, and they wrote bubble sort, Shell sort, and other algorithms that were slow but easy to code in BASIC, as opposed to Quicksort.

If you have the kind of functions that exist in C, Pascal, LISP, ML, Python and many other languages then you can write a simple Quicksort in a few lines of clear code.
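
To make that concrete, here is a minimal recursive Quicksort in C (my own illustrative sketch, not code from any of the systems discussed here). Each recursive call gets its own copies of its bounds on the call stack, which is exactly what GOSUB couldn't give you:

    #include <stdio.h>

    /* Swap two ints through pointers. */
    static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Sort a[lo..hi] in place.  Each recursive call gets its own lo,
       hi, and partition index on the call stack. */
    static void quicksort(int *a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[hi], i = lo;              /* Lomuto partition */
        for (int j = lo; j < hi; j++)
            if (a[j] < pivot) swap(&a[i++], &a[j]);
        swap(&a[i], &a[hi]);
        quicksort(a, lo, i - 1);
        quicksort(a, i + 1, hi);
    }

    int main(void) {
        int v[] = {5, 3, 8, 1, 9, 2};
        quicksort(v, 0, 5);
        for (int k = 0; k < 6; k++) printf("%d ", v[k]);   /* 1 2 3 5 8 9 */
        printf("\n");
        return 0;
    }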


> Programming BASIC was a lot like programming assembly language in terms of how the GOTO and GOSUB keywords worked.

One of the things many people fail to understand when they criticize GOTO and GOSUB is that they are essentially an expression of what is happening at the hardware level. Modern languages didn't really do away with them. They simply added a layer of abstraction that made it easier to develop reliable software. Unfortunately that abstraction also has overhead, which was problematic when you had only a few kilobytes of RAM to work with on early personal computers. I would imagine that it was also problematic on the early multi-user systems that BASIC originated on. It isn't that GOTO is evil. It simply became less necessary for developers to use it in high level languages as technology improved.

Of course, the other thing that made spaghetti code inevitable was the line number based syntax. Until development tools improved, people were basically plopping new statements in random locations because they needed more "space" between existing statements. Yet line numbers were used in early systems since the BASIC REPL also served as a line editor. (At least on personal computers. I'm not sure how it worked on mainframes.)


People got BASIC to run on very tiny machines such as the 4K TRS-80 Color Computer and the 1K Sinclair ZX80.

If you added a bit more RAM than that, you had more of a choice; for instance, a 16K Color Computer could run EDTASM+

https://www.cocopedia.com/wiki/index.php/EDTASM%2B

and you could edit programs with a text editor, save them to cassette tape, assemble them, etc. In the same amount of RAM you could have fit a FORTH implementation, and with a disk system you could have an experience similar to BASIC, based around editing individual disk blocks. With 64K of RAM I would run a C compiler on that Color Computer, and people did the same with CP/M.

As for mainframes, at first they didn't have text editors; instead you would put together a deck of punched cards and submit it to the FORTRAN compiler, which would output the object code onto another deck of punched cards.

It wasn't unusual for people in the 1980s to use BASIC preprocessors that would read a text file, append line numbers, and let you use structured loops and GOTOs with named labels. I read about

https://www.pcjs.org/software/pcx86/lang/other/ratbas/1982/

and wrote one for my TRS-80 CoCo. It was the sort of thing you could write in BASIC without much understanding of how to write compilers.
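
The core of such a preprocessor really is small. Here is a toy version of the idea in C rather than BASIC, with a syntax made up entirely for this sketch (a line "@name:" defines a label; a line ending in "GOTO @name" jumps to it; this is not the syntax of the tool linked above):

    #include <stdio.h>
    #include <string.h>
    #include <stdlib.h>

    #define MAX 1000

    int main(void) {
        char *src[MAX];
        char buf[256];
        char names[MAX][32];
        int  nums[MAX], nsrc = 0, nlab = 0;

        /* Pass 1: read the program, strip "@name:" definition lines and
           remember the number the following line will get (10, 20, ...). */
        while (nsrc < MAX && fgets(buf, sizeof buf, stdin)) {
            buf[strcspn(buf, "\n")] = '\0';
            if (buf[0] == '@') {
                char *c = strchr(buf, ':');
                if (c) *c = '\0';
                snprintf(names[nlab], sizeof names[nlab], "%s", buf + 1);
                nums[nlab++] = (nsrc + 1) * 10;
                continue;                     /* definitions emit nothing */
            }
            src[nsrc++] = strdup(buf);
        }

        /* Pass 2: print each line with its number, rewriting GOTOs. */
        for (int i = 0; i < nsrc; i++) {
            printf("%d ", (i + 1) * 10);
            char *at = strstr(src[i], "GOTO @");
            int resolved = 0;
            if (at)
                for (int j = 0; j < nlab; j++)
                    if (strcmp(at + 6, names[j]) == 0) {
                        printf("%.*sGOTO %d",
                               (int)(at - src[i]), src[i], nums[j]);
                        resolved = 1;
                        break;
                    }
            if (!resolved) printf("%s", src[i]);
            printf("\n");
            free(src[i]);
        }
        return 0;
    }

Fed a labeled listing on stdin, it prints the same program numbered 10, 20, 30, ... with each "GOTO @name" rewritten to the label's line number.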


MSX BASIC had "RENUM" or some such, which would renumber all the lines to, for instance, 100, 110, 120, etc. It also automatically updated all GOTO and GOSUB targets. So if you ran that all the time, it was almost like having line numbers in a text file, but with an extra manual "RENUM" chore thrown in.


This was standard in Microsoft BASICs.


Dijkstra doesn't describe the go-to feature as "evil" but as "harmful" and it is.

The problem, which you even mention, is that it's impractical to work with larger programs because their control flow becomes too hard to understand. Now, if you have (as many of the earlier BASIC systems did) only 4096 bytes of RAM, you can't write such complex programs anyway; you don't have enough RAM. But even by the time these 4K home computers started to appear on the market, the price of a modest business computer was tumbling, and such a computer might have dozens of kilobytes of RAM.

Once a flow diagram you can draw on a whiteboard is no longer a correct description of your whole program but merely a high-level summary, go-to is just a footgun.


Definitely a gateway drug. I had an Apple ][+ around that time too. The manuals for that were incredible. Luckily, my dad's startup had tons of old hardware for it: printers, a Koala Pad, plenty of disk drives, a modem, an EPROM programmer, an 80-column card with CP/M and Pascal... Even though it was about 10 years old, it was great. But the C64 always seemed far more approachable. That brown box with the clacky keys... And I was too young to know about Dijkstra's "Goto Considered Harmful". It probably could have made me a far better engineer if I had picked up better habits sooner.


> C64 BASIC ingrained one evil in me: "goto". At least once a month I'll be working on some script and I'm stuck, but a little voice in the back of my head will say: "a goto would fix this part you are stuck on!"

goto is perfectly fine if used right. Kernel C code tends to use a lot of goto to consolidate return points and to clean up resources on the way out. A long function with no goto to one common return point, or to multiple staggered ones, is suspicious.

The problem that led to the famous paper was rather that goto was abused: used in places where "higher level" control constructs like "for", "while" and so on would be better. Partly because the languages just didn't have any, like, for example... C64 BASIC.
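
A sketch of that consolidated-return style (illustrative C with made-up names, not actual kernel code): each failure jumps to the label that unwinds only what has been acquired so far, and success falls through the same cleanup to a single return point.

    #include <stdio.h>
    #include <stdlib.h>

    /* Read n ints from a file and sum them.  Returns 0 on success with
       the result in *sum, -1 on any failure.  One return point; each
       error jumps past the cleanup steps that don't apply yet. */
    int sum_file(const char *path, size_t n, long *sum) {
        int ret = -1;
        FILE *f = NULL;
        int *buf = malloc(n * sizeof *buf);
        if (!buf)
            goto out;

        f = fopen(path, "rb");
        if (!f)
            goto out_free;

        if (fread(buf, sizeof *buf, n, f) != n)
            goto out_close;

        *sum = 0;
        for (size_t i = 0; i < n; i++)
            *sum += buf[i];
        ret = 0;                 /* success: fall through the cleanup */

    out_close:
        fclose(f);
    out_free:
        free(buf);
    out:
        return ret;
    }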


Yep, I used a lot of gotos writing framework software that worked with CoreFoundation on Mac OS. We used the pattern a lot in early Mac Toolbox framework code as well.

Often within the scope of a function we would need to allocate/create dictionaries, arrays, and other objects requiring disposal. Mac OS's CoreFoundation API often returned NULL when creation (or insertion, etc.) failed. The sane thing to do when you got an unexpected NULL was to "bail" from the function. But rather than return immediately, there was cleanup code at the bottom of the function, code like "if (stackDict != NULL) { CFRelease(stackDict); }". So we often added a label (typically called "bail") just before the cleanup code and would "goto bail".

These were the days before garbage collection....
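
A rough reconstruction of the pattern (the CoreFoundation calls are real, but the function itself is invented for illustration):

    #include <CoreFoundation/CoreFoundation.h>

    /* Hypothetical example of the "bail" pattern: wrap an int in a
       CFNumber inside a CFArray.  Any NULL from a Create call jumps to
       one label that releases whatever actually got created. */
    CFArrayRef CopyWrappedNumber(int value) {
        CFArrayRef result = NULL;
        CFNumberRef num = NULL;
        CFMutableArrayRef arr = NULL;

        num = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &value);
        if (num == NULL)
            goto bail;

        arr = CFArrayCreateMutable(kCFAllocatorDefault, 1,
                                   &kCFTypeArrayCallBacks);
        if (arr == NULL)
            goto bail;

        CFArrayAppendValue(arr, num);
        result = arr;            /* transfer ownership to the caller */
        arr = NULL;              /* so the cleanup below won't release it */

    bail:
        if (num != NULL) CFRelease(num);
        if (arr != NULL) CFRelease(arr);
        return result;
    }

The nice property is that there is exactly one exit, so the release logic never gets duplicated across early returns.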


Just sprinkle a few NOPs into your 6502 assembly regularly, so you can replace them later with a JMP.


Wait wut?


If you programmed in machine code (without a fancy assembler), you couldn't simply move your instructions around in memory to make room for bugfixes, so instead you placed a couple of NOPs here and there (usually in strategic places like the start of a routine or right after a conditional branch) so that you could later replace those NOPs with jumps or subroutine calls to add patches.
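
For instance, three NOPs ($EA) reserve exactly the three bytes a JMP absolute ($4C, low byte, high byte) needs. Here's the byte-level idea expressed as a contrived C sketch; the opcodes are real 6502, but the routine and the patch address are made up:

    #include <stdio.h>

    /* A made-up routine: load 0 and store it to the C64 border colour
       register ($D020), with a three-NOP patch slot in the middle. */
    unsigned char code[] = {
        0xA9, 0x00,         /* LDA #$00                   */
        0xEA, 0xEA, 0xEA,   /* NOP NOP NOP  <- patch slot */
        0x8D, 0x20, 0xD0,   /* STA $D020                  */
        0x60,               /* RTS                        */
    };

    int main(void) {
        /* The later bugfix: overwrite the slot with JMP $C000, sending
           execution off to a patch routine placed elsewhere in memory. */
        code[2] = 0x4C;     /* JMP absolute    */
        code[3] = 0x00;     /* $C000 low byte  */
        code[4] = 0xC0;     /* $C000 high byte */

        for (size_t i = 0; i < sizeof code; i++)
            printf("%02X ", code[i]);
        printf("\n");       /* A9 00 4C 00 C0 8D 20 D0 60 */
        return 0;
    }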


OK, I get it now. You brought me back. I now remember peppering my x86 asm code with those NOPs when writing early code generators, for example when a JMP might need to reach past 127 bytes.


http://www.neocomputer.org/projects/et/ has some really good examples of that kind of debugging (reusing existing space/instructions)


You could also theoretically understand everything (with enough time and patience), down at least to what all the chips in the C64 did, if not the transistors inside them. That's what I miss: the feeling that I was in control of my computer. Now the complexity is so high I feel like I can barely scratch the surface.


I feel really bad for anybody starting off today. The technology landscape is terrifying, and it's really hard to know what to learn or even where to start. With a C64, you could master that machine inside and out very easily, and while you were at it, really understand how a CPU works. It's amazing what they were able to do with a CPU of roughly 3,500 transistors (compared to the 50-odd billion in modern processors).


> Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling.

There were also far fewer variations of things to confound that info sharing. You had a C64, or a 128, or an Atari 800? You had the same everything as everyone else: same manual, same BASIC, same registers, same books available. You didn't have to worry about what version of something you had, or whether something got upgraded, or what video card you had, etc.

When there was a disagreement about something, it generally wasn't hard to at least point to a common base to start from.


The C64 Programmer's Reference Guide[1] was the game changer for me. 6510 instruction set, the computer's full memory map, register maps for all the chips, detailed information about I/O, the KERNAL, and so on. I don't recall it coming in the box; I had to save up my $$$ and actually buy the book.

1: https://www.c64-wiki.com/wiki/Commodore_64_Programmer%27s_Re...


Yes, and don't forget the great schematic diagram in the back! It definitely didn't come in the box with the C64; I had to save up as well. I still have my copy, tattered though it is, sitting here on my office bookshelf. That, and the book _Assembly Language Programming with the Commodore 64_ [1] changed the way I viewed computers.

[1] https://archive.org/details/Assembly_Language_Programming_Wi...


Mainly or maybe even only in the USA, I think.

Over in the other significant English-speaking economy, we were a lot poorer in the early 1980s, and as such, anything priced in US$ was too expensive.

So things like Atari 8-bits didn't sell well here. In fact only the budget C64 did, and it was an expensive machine in early-1980s Britain.

Which was a good thing, because it encouraged a flourishing local market in locally-made computers.

Although the C64 sold in the millions, and so is familiar to many, Sinclair's ZX Spectrum was even more common over here. And although we didn't know it back then, it was huge behind the Iron Curtain too, in the form of dozens and dozens of unauthorized clones. Every Communist nation had its own ZX Spectrum clone, or maybe several. Some adapted to display Cyrillic, some built from imported bits and some from Soviet bits, some with discrete logic in place of Sinclair Research's ULA.

And all of the Euromicros had better BASICs than the C64.

Sinclair BASIC wasn't great but it did graphics and sound. The best was BBC BASIC on the BBC Micro from Acorn. Named procedures, with local variables and recursion. IF...THEN...ELSE, various loop constructs, and inline 6502 assembler.

I reckon it's partly the terrible BASIC of the C64 that turned everyone against the language:

https://liam-on-linux.livejournal.com/71381.html


> In fact only the budget C64 did, and it was an expensive machine in early-1980s Britain.

I think you suffered from what I did in Canada at the time: pre-free-trade duties. Trade is so unencumbered by duties now, comparatively.

The C64 was 2x or even 3x the US price in Canada, mostly due to duty.

I recall buying a unit in the US, after convincing my parents to smuggle it across in their car...


You could be right. I was 12 or so -- I paid little heed to such things then. :-)


It really was an amazing time. Seeing that boot-up screen with the blue on blue still turns on that feeling of "exciting discovery awaits"... Prior to accidentally taking "Computer Science" in school (it was that or drafting), I had no idea what it was about. It was like trying heroin for the first time. Commodore PETs at school, and a year later I worked my ass off all summer to save up to buy a C-64 when it first came out. My parents chipped in for the tape recorder.


I had that experience. We were an Apple II middle school, and a bunch of us would stay after and work in the computer lab. I wrote a little game in low-res graphics. We'd trade tips. Lamentably, the teachers weren't much help. We'd ask the music teacher, who would be in there sometimes, as he seemed to have some good knowledge ("music on the Apple II wasn't easy... you could "click the speaker", and by doing so rapidly you could get tones").

We had access to "Nibble" magazine, which had a lot of printed code. I got my "sound" routines from that, and from "Beagle Bros" programs. It was fun. We never figured out machine code... we got the basics, but it was just too much.

Though we just had each other, as our lab wasn't internet-enabled (as was the style in the early 80s).


Ah, good old `GR`... `PLOT`. When I found out about `HGR`, `HCOLOR` and `HPLOT`, I thought it was the greatest thing ever. And having that tiny little built-in speaker made everything I did with PEEK and POKE sound so terrible. But it kept me trying new things over and over again.


Another thing was not having any preconception of what was possible or not; no such thing as "you have to use X to do Y".


This.

My first thought back then with my C64 and BASIC: "I'll make it so when someone types LIST they won't be able to see the code, by writing more BASIC code that prevents it".


REM<SHIFT-L> - it's weird the things that stick in your head after all this time. Likewise SYS 64738, 64760, 64767 and a few others. One to really mess with people was:

    POKE 53280,0 : REM border to black
    POKE 53281,0 : REM background to black
    SYS 64767   : REM fast restart which doesn't reset the colours

At that point, unless they know what's happened, they can't RUN/STOP-RESTORE their way out of anything.

I do miss the 64 (and the Amiga which followed). The variety of machines back then was really refreshing compared to now.


Well said. Perhaps it was also because you and your friends had the humility to realize that it was you who didn't understand something a few days earlier. Helping others to learn and working with a group where learning is allowed and nurtured is incredibly important.

“The best thing for being sad," replied Merlin, beginning to puff and blow, "is to learn something. That's the only thing that never fails. ... There is only one thing for it then — to learn. Learn why the world wags and what wags it. That is the only thing which the mind can never exhaust, never alienate, never be tortured by, never fear or distrust, and never dream of regretting. Learning is the only thing for you. Look what a lot of things there are to learn.”

― T.H. White, The Once and Future King

https://www.goodreads.com/quotes/21627-the-best-thing-for-be...


> Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling.

If someone says "What, you didn't know that?" about something they only learned yesterday, they're being a jerk. Because this was all new to all of you, it became harder to feel like saying that.


Were there times when people who discovered a shared interest were more keen to mine each other's insights or differences than they are of late? Were people generally keener to grow, test, and propagate their outlooks in person? Perhaps it's just my own foibles or relative age, but in later years it seems more contentious to conversationally dig into and challenge others' takes on their subjects of interest. It's as though as individuals we have no call to query another's interest, only to pleasantly listen and affirm them. Perhaps with interests now developed more through information technology, where contrasting perspectives are already collected and competitively ranked, it has become harder to be intellectually generous and curious in person.


What's sadder is that nowadays you're expected not only to have breadth of knowledge (sacrificing depth) but also to learn fast. We romanticize speed. We want the chef who creates quality food at fast-food speed!

Something the author of the article could also have touched on is that the lack of resources was also a boon. I recall reading interviews with great programmers of yesteryear saying how they would "read the same book twice over to understand deeply" and "sometimes I'd read the same concept from another book to make it really click", one of them being John Carmack!

A naive assumption I made after reading statements like those was this: when performing, sometimes you go fast to pressure yourself, but under that pressure the odds of delivering quality work are greatly reduced. When comprehending something new, however, you cannot romanticize speed. You almost have to value comprehension and depth. Speed comes as a byproduct of the hours spent understanding.

Instead of that, we favor speed of questionable learning quality. We tag ourselves "jacks of all trades" because we know enough Next.js to ship the product, but not enough to tell what we could have done better, or even why we chose A over B beyond "the Stack Overflow answer told me to". That is, until the client complains. We really stretch "premature optimization is the root of all evil" into "premature readability", "premature anything". Gotta go fast!

tl;dr aside from the programming environment, not having access to "quick and dirty" answers to any problem affected everyone's expectations of delivery and performance, and that also played a role in how we learned.


“We” in your statement is the business community. Developers are craftsmen, and I don't think I know a single one personally who values speed over quality. It's the businessmen who don't want to pay for quality, just the minimum to turn a buck. The bottom line. Which isn't necessarily a bad thing, but it is a bad culture to embrace. There's a reason Toyota is the number one car manufacturer in the world today, and it wasn't from bean counting and focusing solely on the bottom line the way these Western so-called business geniuses do.


I wonder how much of the difference is because you were friends working co-located, not mashed together arbitrarily as coworkers communicating via Slack and cameras-off calls.


There was probably a great deal of that. We also played instruments together in an attempt to be "in a band". I didn't find the uppity know-it-alls until college. Discussing Delphi/Object Pascal with people via NNTP was definitely my gateway into, "everyone is smarter than you".


Any cool band names that came out of it? :)


Your age is off just a bit - I almost thought you might be the founder of a trio of C64 (and PET) programmers with the same idea. It was an exciting time to be 'into' computers!



