I learned Turbo Pascal in an MS-DOS emulator running on old green-screen Burroughs smart terminals, circa 1987. I knew the terminals had some sort of graphics capability because their font changed when they left VMS mode and started emulating MS-DOS, so I wrote a program to rummage around in memory until I found where the font designs were stored. Then I wrote a font editor that changed the standard font whenever I logged in, to a design based on my own handwriting. After that, I took a leaf out of the Microbee computer's book and emulated hi-res graphics: I wrote a program that printed all the ASCII characters from 33 to 255 in a rectangle, set their font definitions to all zeroes, and then selectively set individual pixels back on according to a pattern that assumed the exact layout of characters. Implemented line, circle, flood fill, and a few other graphics primitives. Fun!
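The trick described above can be sketched in a few lines. This is an illustrative reconstruction, not the original code: the names, the 16×13 cell rectangle, and the row-major layout are all assumptions; the idea is just that plotting a pixel means finding which on-screen character it falls in and flipping a bit in that character's glyph definition.

```python
# Sketch of emulating a hi-res bitmap by redefining the glyphs of
# characters laid out in a fixed rectangle on screen.
# Dimensions and layout here are illustrative assumptions.

CELL_W, CELL_H = 8, 8          # one glyph is an 8x8 pixel matrix
COLS, ROWS = 16, 13            # rectangle of character cells (example size)
FIRST_CHAR = 33                # first redefinable character code

# font[c] is the glyph for character c: one byte per scanline, MSB = leftmost pixel
font = {FIRST_CHAR + i: bytearray(CELL_H) for i in range(COLS * ROWS)}

def plot(x, y, on=True):
    """Set or clear pixel (x, y) by editing the right glyph's bitmap."""
    col, row = x // CELL_W, y // CELL_H
    char = FIRST_CHAR + row * COLS + col      # assumes row-major cell layout
    bit = 0x80 >> (x % CELL_W)
    if on:
        font[char][y % CELL_H] |= bit
    else:
        font[char][y % CELL_H] &= ~bit

def line(x0, y0, x1, y1):
    """Bresenham line built on plot(), like the primitives mentioned above."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx + dy
    while True:
        plot(x0, y0)
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
```

Circle and flood fill would sit on the same plot() primitive; the real work on the terminal was then pushing the modified glyph definitions back into font memory.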
One of my first "public" programs was written in Turbo Pascal: a readme-style app for a demo group (I wasn't anywhere near capable of doing what the rest of the group could do, but at least I found a way to contribute). I used a custom-made font, had a scrolling buffer, etc. Always nice to take a stroll down memory lane.
Pretty sure it was the way hi-res graphics worked on the VIC-20 too -- 22 characters by 8 pixels per character = 176 pixels wide, multi-colour mode where each block of 8x8 pixels had a background and a foreground colour. Deeply freaky.
The Vic also had a mode whereby you could make "double height" characters (halving the number of addressable rows), so when you typed "a", it would emit (iirc):
a
b
and when you typed "b", it would emit:
c
d
When I was about 8, I fooled with this, peeking and poking pixels so that the "a" and "b" bitmaps were the top and bottom halves of a double-height "a", through to the end of the alphabet. I imagined that this would make it easy for my grandparents to use computers. Never mind that they couldn't type, had no interest, and would never remember commands like "load $,8,1".
On the ZX81 and ZX Spectrum, they called them "User-Defined Graphics" or UDGs. I think I wrote a pixel-flipping grid font editor for characters on every computer I had access to! The ZX81, the ZX Spectrum, the Apple II (once I noticed that one of the games I had (Taipan?) hooked the text output routines into a hires character generator), and eventually various x86 programming environments, although there you were editing bitmaps, not changing the character generator source. :-)
Alright this is the most awesome post I've read all day :) I really need to get to work putting together the kit Microbee that's currently sitting on my other desk....
Are you old enough to remember daisy wheel printers? Basically, the computerised equivalent of a typewriter, with the individual letters on a daisy-like wheel that spun around under computer control to produce slow but "letter quality" printing. When I showed people my handwriting font, they were very impressed and asked if my printouts would be in my handwriting too. Given that the only printers we had were daisy wheels, I suggested that perhaps this would not be happening.
Some daisy wheel printers could be coerced into printing high-resolution graphics, by repeatedly printing a few zillion periods and using the variable line spacing and tab settings.
'Fast' was not a word one would use to describe them, though, even compared to the matrix printers that took minutes for each page of high-resolution output.
Coming back to the original post, I wrote a proof-of-concept Braille output program for a daisy wheel printer. Basic idea: take a line of text input, e.g. "hello". Convert to Braille ("⠓⠑⠇⠇⠕"), but reverse the dot patterns to make "⠪⠸⠸⠊⠚". Sandwich a sheet of paper towel between two sheets of paper, feed it into the printer and print the reversed dots using "." and space, micro-positioned.
Theoretical result: raised dots punched into the paper by the daisy wheel, able to be read by a blind person.
Actual result: the proportions were wrong for reading, and most of the time the paper jammed on the roller because of the padding.
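The dot-reversal step above maps neatly onto Unicode braille, where the low six bits of the offset from U+2800 encode dots 1-6 (bits 0-2 are the left column, bits 3-5 the right). Mirroring a cell for back-of-page embossing just swaps the two columns, and then the whole line is reversed. A minimal sketch:

```python
# Mirror a line of 6-dot Unicode braille for back-of-page embossing:
# swap each cell's left/right dot columns, then reverse the line.
# In U+28xx, bits 0-2 of the offset are dots 1-3 (left column),
# bits 3-5 are dots 4-6 (right column).

def mirror_braille(line: str) -> str:
    out = []
    for ch in line:
        dots = ord(ch) - 0x2800
        flipped = ((dots & 0b000111) << 3) | ((dots & 0b111000) >> 3)
        out.append(chr(0x2800 + flipped))
    return "".join(reversed(out))

print(mirror_braille("⠓⠑⠇⠇⠕"))  # "hello" -> "⠪⠸⠸⠊⠚"
```

Running it on the example from the comment reproduces the reversed string shown there.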
So why don't we have full blown graphics/widgets in the terminal?
I'm thinking a hybrid console meets explorer: When I type "ls *.png", why don't I get a sortable table? Or, if I mouseover a filename, get a preview?
I remember "type BBSAD.ANS" in the DOS days - you'd see an animated ANSI right in the terminal - but I can't type "view lolcat.png" in 2014 (without shifting focus to the GUI).
Maybe this is overkill and not worth it, but I've always felt there's room to make the CLI a little more GUI-like.
The command line is contextually intelligent and presents interactive graphics and text. Not a simulation of a teletype on a simulation of a character terminal.
Why should we have graphics in the terminal? Use the GUI whenever you need to do these things.
> Or, if I mouseover a filename, get a preview? ...I can't type "view lolcat.png"
Most GUIs don't even do this. However, on OS X with iTerm you already can cmd-click the filename to open the file. Or use the Finder's QuickLook utility from the command line: qlmanage -p "$@" >& /dev/null &
Basically I see no need to merge the two. They operate perfectly well together and complement each other. To extend the functionality you could simply write shell scripts which generate custom HTML and send it to a browser (or QuickLook in the case of OS X).
Furthermore, a lot of these suggestions are already implemented in ipython notebooks, which is a REPL with graphical capabilities.
qlmanage -p is cool. I tend to just use "open" personally, as in "open ." to see the current folder or "open lolcat.png" to view a file in the default application. On Windows you can use explorer.exe to do similar things. I do, however, wish the terminal had a better GUI for history, scrollback bookmarks, note taking, etc. Especially when I have a 4K display with so many pixels, it's criminal not to use a few of them more effectively.
That's sad. I remember when the author first announced Termkit on Reddit and got massively shat on for trying to introduce graphics (the very thought!) to something as serious as The Command Line.
I hope he hasn't totally abandoned the idea, but I wouldn't be surprised if he did.
It's not exactly a terminal, but I like the IPython Notebook for pretty much those reasons. I think you could quite easily build something like a sortable ls output as a reusable component.
We do in fact have full-blown graphics modes in the terminal and have had for decades. You can switch xterm to Tek 4014 mode at your leisure, although this might not have been the graphics you had in mind. I believe you can _still_ pay money to HP for the pleasure of using DECterm, which emulates the VT340 and other ReGIS-protocol terminals.
One of the craziest hacks on graphics terminals was NCSA Telnet. It supported multiple detached Tek terminals as well as a raster graphics protocol called "ICR". It had its own TCP stack built in, and could offer these features even on DOS. One of the weirdest rigs I ever saw was a guy at college who browsed the web, with graphics, using NCSA Telnet on a PC/AT running PC-DOS, at a time when that computer was at least 10 years old.
To play with it: gnuplot, for example, can output to an xterm running in Tektronix mode. Open an xterm, then...
* mkfifo /tmp/tek
* cat /tmp/tek
* Ctrl-Middle-Mouse menu:
- show tek window
- switch to tek window
- (in Tek window:) hide vt window
Open a second terminal (Xterm or any other)
* start gnuplot; at the gnuplot prompt:
* set term tek40xx
* set output "/tmp/tek"
* plot tan(x) {or whatever you want to plot}
Using two separate terminal emulators, and the fifo, makes it easier to coordinate the two different modes, because frankly gnuplot doesn't switch sensibly between the two display modes and will clobber one or the other, either with Tek control codes on the VT or by writing the prompt over the graphics.
I've got a couple of VT340s in storage; they have ReGIS graphics on them, which is quite fun. It would be great if someone decided to implement ReGIS in one of the Linux or Mac terminal emulators, because all those cool DEC BASIC games would suddenly work great.
The only limitation I've found is that the standard ppmtosixel conversion tool spits out 8-bit control characters, but xterm seems to recognise only their 7-bit equivalents (hence the sed command in that screenshot).
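Presumably that sed command is doing a translation along these lines: a sixel stream is wrapped in DCS ... ST, which exist both as single 8-bit C1 bytes (0x90 and 0x9C) and as 7-bit ESC sequences (ESC P and ESC \). A sketch of the same conversion (a naive byte replace, which is safe here because the sixel payload between the delimiters is plain ASCII):

```python
# Translate the 8-bit C1 controls that ppmtosixel emits into the
# 7-bit equivalents that xterm's sixel decoder recognises:
#   DCS (Device Control String) 0x90 -> ESC P
#   ST  (String Terminator)     0x9C -> ESC \

def c1_to_7bit(data: bytes) -> bytes:
    return data.replace(b"\x90", b"\x1bP").replace(b"\x9c", b"\x1b\\")
```

A one-liner like `sed` can do the same substitution on the byte stream before it reaches the terminal.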
It won't let me edit the comment, but I should throw in: xterm must be compiled with `--enable-regis-graphics` (or `--enable-sixel-graphics` for sixel graphics) for it to work. Your distro's xterm might not have been compiled with those flags.
This was the case for me, but after rebuilding from source it worked fine. Given that the code exists, I wonder why it isn't more common in terminal emulators. It's really handy when you want to pop up a simple picture or a line drawing.
Surprised that you missed it, so I'm still not quite sure if maybe I misunderstood something in your question, or you were just having some healthy holidays around that particular time :)
[EDIT] Oh! For more added fun, I've stumbled upon ReGIS's sibling for bitmaps, SIXEL; some pretty screenshots e.g. at:
I think this is related too.. for 256 color terminals, a little off-the-cuff go lib I wrote that converts pictures to blended colored ascii https://github.com/minikomi/ansipix
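The core of a converter like that is quantising each RGB sample onto xterm's 256-colour palette, where entries 16-231 form a 6x6x6 colour cube. A rough sketch (this naive linear quantisation is an approximation; the real cube levels are not evenly spaced, and the greyscale ramp at 232-255 is ignored here):

```python
# Map a truecolor RGB sample onto the xterm 256-colour cube.
# Entries 16-231 form a 6x6x6 cube; this crude linear quantisation
# ignores that the actual cube levels (0,95,135,175,215,255) are
# unevenly spaced, and skips the greyscale ramp at 232-255.

def rgb_to_xterm256(r: int, g: int, b: int) -> int:
    to6 = lambda c: round(c / 255 * 5)   # quantise 0-255 down to 0-5
    return 16 + 36 * to6(r) + 6 * to6(g) + to6(b)

# Print one orange '#' using the resulting palette index.
print(f"\033[38;5;{rgb_to_xterm256(255, 128, 0)}m#\033[0m")
```

From there it's a matter of sampling the image down to character-cell resolution and choosing a glyph per cell, which is where the "blended ascii" part comes in.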
A semi-related play-thing project (which seems to have stagnated completely now: last update 2001, which makes me feel old) is AAlib: http://en.wikipedia.org/wiki/AAlib
If you want to give yourself a migraine, try playing through Quake2 using that as the output device...
Back in university, we had lots of Falco t310 terminals.
(VT100-esque terminals connected to a serial port.)
Occasionally, I'd accidentally cat an executable file to the terminal and it would sometimes switch to a graphical mode and draw lines on the screen like an 80s BASIC program. Clearly there must be some control codes lurking in that executable that coincidentally work as graphical instructions.
I tried experimenting with cutting the file up into pieces to narrow down the magic control codes, but the number of bytes with a value of 7 in that file started annoying the people around me.
They replaced those terminals with Windows PCs the next year.
Why use braille when you can have a real image? A really awesome feature of iTerm 2's new beta is the ability to display PNGs right inline in the terminal[1]. It's easy to use and can create some really cool effects[2].
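Under the hood it's a proprietary OSC escape carrying the image as base64, the same sequence iTerm2's imgcat script emits. A minimal sketch of building one (the argument list here is from memory, so treat the details as approximate):

```python
# Build iTerm2's proprietary inline-image escape: OSC 1337 carrying
# the image as base64, terminated by BEL. Based on iTerm2's imgcat
# script; the exact argument list is from memory, so approximate.
import base64

def inline_image_escape(image_bytes: bytes) -> str:
    payload = base64.b64encode(image_bytes).decode("ascii")
    return f"\033]1337;File=inline=1:{payload}\a"
```

Printing the returned string into an iTerm2 window should render the image; other emulators will simply ignore (or garble) the unknown sequence.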
That reminds me of an old idea from lcamtuf (the first entry in http://lcamtuf.coredump.cx/soft/obsolete/STUPID_IDEAS.txt): replace fonts while text is being displayed, directly on VGA/EGA. I wonder if anyone attempted to implement it in the end. This project comes really close, though.
I believe the old music software Impulse Tracker did something like this. It was in text mode, yet had smooth mouse cursors, audio visualizations and other "graphical" things.
A number of text-mode programs had smooth mouse cursors. They modified extended ASCII characters to produce the effect. Some intros did limited animations (e.g., scrolling starry backgrounds) using the same approach. This and cycling color palettes made for some pretty snazzy effects.
You're right about that. For some reason I have it in my head that IT did something more dynamic with the character set to produce the graphics, but I can't find a source for that other than the recent blog posts by Jeff Lim.
It didn't have braille characters, but my old Kaypro 2x had a blocky, character-oriented form of pixel graphics. (The rather fat pixels needed to be rolled together into a "character" that was output, as best as I recall.)
It was enough to be able to display some chunky looking but workable fractal images. Somewhat similar to the time I generated the like on a Dec something or other and printed them on the available dot matrix printer.
Reminds me of the TRS-80 Model I, which could do "graphics" using special characters in the space above ASCII 127. Each such character was a 2x3 pixel grid with certain pixels turned "on" or "off". With 64 such characters, it was possible to plot arbitrary monochrome graphics (albeit with really chunky pixels) on the screen. Which is how Dancing Demon and other TRS-80 programs got their famous chunky appearance.
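Those 64 characters map neatly onto a bitfield. From memory, codes 128-191 encode the 2x3 cell with one bit per pixel, numbered left-to-right and top-to-bottom starting at the low bit; treat the exact numbering as an assumption. A sketch of packing a cell:

```python
# Pack a 2x3 pixel cell into a TRS-80 Model I graphics character code.
# Assumed encoding (from memory): codes 128-191, one bit per pixel,
# numbered left-to-right, top-to-bottom, starting at the low bit.

def cell_to_char(cell) -> int:
    """cell is 3 rows x 2 columns of 0/1; returns the character code."""
    code = 128
    for bit, (row, col) in enumerate((r, c) for r in range(3) for c in range(2)):
        if cell[row][col]:
            code |= 1 << bit
    return code
```

A full-screen plot routine would carve the framebuffer into 2x3 cells and emit one such character per cell, which is exactly why everything came out so chunky.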
If you want to get into the background of Mystery House - not how to play it but more the background of how it was made and fits into the larger context of interactive fiction - then you should read Jimmy Maher's description at http://www.filfre.net/2011/10/mystery-house-part-1/ along with part 2. Those are part of a very long and enjoyable description of the evolution of interactive fiction, the developers, and the computer systems.
> Specifically, he wanted to bring FORTRAN, as it happens the implementation language of the original Adventure (not that Ken likely knew this or cared), to the little Apple II. With that purpose in mind, he registered a company of his own, choosing to call it On-Line Systems, a name fairly typical of the vaguely futuristic, vaguely compound, but essentially meaningless names (Microsoft, anyone?) that were so common in the era.