Hacker News
Picotron Is a Fantasy Workstation (lexaloffle.com)
675 points by celadevra_ 7 months ago | 192 comments



I had a good time with PICO-8 - and I think it retains its core appeal - but I've moved on to "genuine" retro hardware with the new crop of machines like CX16, Mega65, or my personal choice, Agon Light. The specification ends up being tighter when there's a board design, chips and I/O ports, and these new machines, like Picotron, are relatively uncompromised in what they can achieve within the I/O spec. You can emulate them, talk to the hardware directly, run BASIC or C or Forth or whatever other language.

Lua might be too slow to run interpreted on real 8-bits as in the Pico series, but it can be used as the base for a cross-compiler instead, and that presents a different spin on the specific coding challenge: Why not create an ultimate development environment, something that generates the precise code needed for that type of project? That's the direction that the highly optimized PICO-8 games took, and it is likewise seen in new demos for C64, Spectrum, A800 etc. - the "big hardware" is leveraged towards the old stuff in a way that can ignore the assumed paradigms of both.


> but I've moved on to "genuine" retro hardware with the new crop of machines like CX16, Mega65, or my personal choice, Agon Light.

I just wish these would move on from the same crop of retro CPUs (z80, 6502, maybe 8080) and clone VDPs on FPGA. I want a retro-style 2d/blit-based machine, but with more advanced hardware. Maybe a Cortex-M, z8000, 68000, low-end Risc-V, etc. Still give it BASIC in a boot ROM, but some more 90s style headroom to grow into.

I guess what I'm saying is, I totally get that all these people grew up on Commodore 64s and are trying to recapture that magic. However, the Amiga/Atari/BeBox/etc hacking days resonate way more with me.


I would love to see the alternate world in which the z8k or 68k were finished in time for use in the IBM PC. Intel is dominant today almost entirely due to the 8086 being available 6-12 months earlier than competing 16- and 32-bit processors.


Maybe check out the Colour Maximite 2 with a Cortex-M7 [1]

Very much in the spirit of the early home computers (inc a decent BASIC) but with a lot more oomph.

[1] https://geoffg.net/maximite.html


I have one of the originals and spent hours messing around with it. It's a great little computer and MMBasic is pretty nice. Thanks for the memory reminder.


I have to imagine that this is the purview of embedded 2D microprocessors / OSes like Linux4SAM.

https://www.microchip.com/en-us/development-tool/linux4sam.o...

With well made compute modules: https://www.digikey.com/en/products/detail/microchip-technol...

And open reference designs that fit on 4-layer boards (!!!!) despite using DDR2. Though I think most people would be more comfortable with 6-layer boards (which is possible with OSHPark today).


Something like an STM32 Discovery board is a good option for recapturing the mid-90s magic. You can get a ~200-MHz Cortex-M4 or M7 with a few MB of flash, external SDRAM, and a display for less than $100. They have really basic hardware 2D accelerators.

The on-chip peripherals are well-documented, but off-chip peripherals require some digging to figure out how to program correctly.

You can debug with GDB surprisingly easily, or find a Forth to throw on there and just start poking registers.


No. Microcontrollers are the wrong solution for this problem.

You can run full-blown Linux efficiently on 500MHz or 600MHz processors like the STM32MP1, powered by AA batteries or other small battery packs.

There's also SAMA5D2, and a few other competitors in this space (both above, and below, the STM32MP1).

When we're talking about "consoles", that's "plug-and-play executables", meaning you now want a proper compile / library -> ELF + loader == Linux kernel, security, etc. etc.

Besides, a DDR2 chip gets you like 512MB of RAM for $4 and easily fits within the power-constraints of AA-batteries. There's very little benefit to going to the microwatt-scale devices like STM32 Discovery.

----------

Microprocessors for the win. Entry-level MPUs exist for a reason, and there's a ton of them well below Rasp. Pi in terms of power / performance.

There's many at the 2D level of graphical performance, but 500MHz is still a bit low for this. You'll probably want to reach into faster 1000MHz / 1GHz MPUs and push into STM32MP2 if you're reaching into 3d levels of performance. (Which is beginning to look like a cut-down cellphone chip really)


I guess it depends on which part you think is fun. Using a big microcontroller is more about pushing the hardware to its limits. Using a small Linux system is about taking advantage of existing libraries. The Playdate has an STM32F7 and it seems to do pretty well as a console.


As there are several available, is there one in particular that you would suggest for this use case?


I liked the 32F746GDISCOVERY which is $56 at Digikey. It has a Cortex-M7 CPU, 1 MB built-in flash, 8 MB of SDRAM, and a 480x272-pixel touchscreen. Games can go on a microSD card. There's a USB OTG port you can use for input.

A low-res screen like this works well because the chip can't rescale its video output.

ST provides libraries for all the peripherals so it's pretty easy to jump in if you know C. I think microPython works on a lot of these boards, too.


It looks like the Agon Light actually runs an eZ80, which is pretty fast and can address the whole 512K of RAM without paging, which does give you that sort of late-80s/early-90s headroom.


The eZ80 is indeed quite fast, and the 24-bit space is a comfortable size for values as well as addressing - I've been working with it in Forth and haven't felt deeply constrained by that size (occasionally needing the double number operations, but nothing more than that). The graphics spec is a little below most 16-bits in terms of color depth, since it's VGA with 2 bits per channel, but the screen resolutions also go quite high, so I expect a lot of 640x480x16 or 800x600x4 games.

Meanwhile, the ESP32 acts as an ultimate co-processor through the VDU protocol inherited from the BBC Micro. That's a part of the architecture that I really appreciate, because it positions software design around the serial I/O and how effectively you delegate your tasks to the VDU. Early reactions from people who are used to 8-bit coding were a bit perplexed because they couldn't push a lot of stuff down that pipe, but as the firmware has developed, the ability to define complex, shader-like programs has also built up. Nothing stops you from describing, e.g., a tilemap engine in terms of a program that stores map data, tiles, and sprite assets in VDU buffers, and then launches a program to do all the array processing needed to blit the tiles and display the sprites when given a camera position.

That's cool because it means that your graphics engine is decoupled from the CPU language. The same VDU sequences will work in Basic, assembly, Forth, C, etc.
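As a concrete illustration of that decoupling: the VDU stream is just bytes, so any host language can produce the same sequence. A hedged sketch in Python - the PLOT encoding (`VDU 25, mode, x; y;` with little-endian 16-bit coordinates) follows the classic BBC Micro convention; `vdu_plot` is an invented helper, not part of the Agon firmware:

```python
import struct

def vdu_plot(mode: int, x: int, y: int) -> bytes:
    """Encode a BBC-style PLOT command: VDU 25, mode, x; y;
    with 16-bit little-endian coordinates."""
    return bytes([25, mode]) + struct.pack("<hh", x, y)

# Move the graphics cursor to (0, 0), then draw a line to (100, 50).
# (On the BBC Micro, PLOT 4 = move absolute, PLOT 5 = draw absolute.)
stream = vdu_plot(4, 0, 0) + vdu_plot(5, 100, 50)
# The same 12 bytes work no matter whether BASIC, C, or Forth emitted them.
```

Whatever emits those bytes down the serial link gets the same picture, which is the point being made above.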


I think this is less people who grew up with C64s and more people who didn't trying to capture the magic without having to learn assembly or making sprites with graph paper and a hex editor.


Sure, but there was a whole other magic era that followed that. That also got to work with low-level assembler and direct framebuffer/video access. But wasn't constrained to 320x200x8 screens with pictograph fonts ("petscii") and POKE/PEEK.

It had other limits to explore other than shoving as much as you could into 64k of memory and 1Mhz of processing power.


The Mega65 is exactly that: it's implemented on an FPGA.


Enough already with retro hardware that reproduces the early 2d games aesthetic. We need to step up the game and make retro hardware that reproduces the early 3d aesthetic.


This is completely my subjective opinion, but I find "early 3D" ugly. In my opinion, pixel art has aged better. In fact, some of it still looks beautiful to me. But early 3D I find almost completely unappealing.

About the only exceptions I can think of are some flightsims like SSI's Flanker (which had very complex graphics for being simply flat-shaded 3D) and the games that emulate this nowadays, like Tiny Combat Arena.


Early 3d generally hasn't aged as well, probably due to extremely low resolution textures and so forth.


The development cost -- of the tooling and the games, let alone hardware! -- is too high. Additionally, 3D games aren't products of their host hardware as much as (older) pixel art games are/were.


I would start with the MIT CADR CPU in an FPGA and add modern hardware around it like Ethernet, USB host, a 2d blitter, etc...


You can run a Commodore VIC or an IBM PC with CP/M, MS-DOS, Windows 3.0, or Linux ELKS on an ESP32 using FabGL.


I’ve often thought the TMS340X0 series would make a good hobby target.


They were interesting to program for.

I started using a graphics card with a 34010 before there were tools available for it. I did ports of GNU binutils (gas and ld) to target it then wrote some firmware in assembler for an X11 server to call to do basic tasks like filling rectangles, line drawing, copying with colour expansion for displaying fonts.

Still have the ISA card but haven't powered up the machine it is in for a long time.


That’s cool. I’m professionally jealous of all the people that got to program neat chips back in the arcade heyday.


+1 My kids and I had a lot of fun with Pico-8, building simple games and learning basic geometry.

The community (inherited from Pico-8) is already implementing cat/wget/grep[1] and, of course, Minesweeper[2] in Picotron! Whatever Joseph White/zep is building brings back the early days of the Internet and IRC, where everybody builds and shares unashamedly while having a ton of fun!

Thank you zep for making computing fun again for more mere mortals!

[1]: https://www.lexaloffle.com/bbs/?tid=140771 [2]: https://www.lexaloffle.com/bbs/?tid=140678


I'm curious how old your kids were when they started hacking on PICO-8 code?

My son (7yo) likes block-based programming (using Scratch, Scratch Jr and Octostudio) and Minecraft, but I'm wondering what a smooth on-ramp might be for PICO-8 or similar.

I got my first computer when I was about 10yo, so I was content to read through the books that came with it to learn the basics of BASIC and a little 6502 assembly. But I don't think that will work due to age, availability of other devices etc.


My kids are 12 and 14 and I can't get them interested in coding beyond what they might do at school. They showed an interest in Scratch, but I believe I introduced it WAY too early. Moreover, it moved them too quickly past the creative aspects and into writing code. Also, years later, I showed them PICO-8 and they weren't terribly interested.

In hindsight, I would recommend working with them at a young age (<10) to design game art and ideas. Then, the parent implements it and ports it to a portable platform. The child sees the creative aspects and the final output, but is shielded from the coding side in the early days. I imagine a child playing a game they designed on paper with crayons would be really satisfying. It would almost be like magic!

Then, let the transition to the coding side happen more organically or through a school program or some such. Maybe when they finally ask, "So, parent, how do I actually code these games?"

Just my non-data backed opinion...


That's what I've been doing with one of my kids. They're designing the sprites and maps in the PICO-8 sprite editor and I'm taking the lead on showing them how to do the rest.

They've also enjoyed tweaking the sprites of existing PICO-8 games.


Yep. My 7yo son is the graphics and gameplay designer, I am the implementor


> Lua might be too slow to run interpreted on real 8-bits as in the Pico series

Would it necessarily be all that much slower than BASIC? It's a very small and orthogonal design.


There are efforts to port pico8 to microcontrollers, but the real problem with lua is memory (easily requires 4MB of memory which is only available on high-end microcontrollers).

https://github.com/DavidVentura/PicoPico


Ok, but which language feature(s) of Lua is it which inherently requires so much memory? I understand you wouldn't exactly get a 100% standards compliant implementation, but what are the hard parts?


With Lua's design, each bytecode operation requires a bunch of memory accesses. These microcontrollers only have a limited amount (~500KB) of SRAM, so you need to place this memory in PSRAM (RAM over SPI), which has "significant" latency for these microcontrollers.

It's definitely possible to use standard Lua and run _some_ of the Pico8 games, but not all.

Lua itself does not require a lot of memory, but PICO-8 guarantees 2MiB of usable RAM.


Nitpicking, according to Wikipedia, PSRAM is Pseudo-SRAM[1]. The fact that it's accessed through SPI is an implementation detail.

1: https://en.wikipedia.org/wiki/Dynamic_random-access_memory#P...


There is a Lua implementation for microcontrollers called NodeMCU.

> Lua based interactive firmware for ESP8266, ESP8285 and ESP32

https://github.com/nodemcu/nodemcu-firmware

A big difference I see between this Lua and PICO-8's is that the former is compiled, whereas the latter is interpreted.

As for how it manages to run Lua with such limitations, the documentation of the Lua Flash Store (LFS) goes into detail.

> The ESP8266 has 96 Kb of data RAM, but half of this is used by the operating system, for stack and for device drivers such as for WiFi support; typically 44 Kb RAM is available as heap space for embedded applications. By contrast, the mapped flash ROM region can be up to 960 Kb, that is over twenty times larger. Even though flash ROM is read-only for normal execution, there is also a "back-door" file-like API for erasing flash pages and overwriting them..

> Lua's design goals of speed, portability, small kernel size, extensibility and ease-of-use make it a good choice for embedded use on an IoT platform, but with one major limitation: the standard Lua RTS assumes that both Lua data and code are stored in RAM, and this is a material constraint on a device with perhaps 44Kb free RAM and 512Kb free program ROM.

> The LFS feature modifies the Lua RTS to support a modified Harvard architecture by allowing the Lua code and its associated constant data to be executed directly out of flash ROM (just as the NodeMCU firmware is itself executed).

> This enables NodeMCU Lua developers to create Lua applications with a region of flash ROM allocated to Lua code and read-only constants. The entire RAM heap is then available for Lua read-write variables and data for applications where all Lua is executed from LFS.

https://nodemcu.readthedocs.io/en/release/lfs/

That's still such a tiny amount of RAM, not nearly enough for PICO-8.

..Oh I see, the project PicoPico mentioned up-thread uses ESP32 Wrover with 4MB PSRAM - instead of Raspberry Pi Pico which it started with but didn't have enough RAM.

Well, having just seen the entrance of the rabbit hole, I can imagine the attraction of PICO-8 and working with such constrained systems - what a challenge!


The constraints fuel creativity, but it also pushed me to start writing a Lua compiler for PicoPico, which shows promise but is also a massive scope creep and mostly the reason I've not worked on PicoPico for a while


Ah, I just found your blog posts about PicoPico, wonderful! I'm going to enjoy reading them and perusing the code. Good stuff.

https://blog.davidv.dev/making-a-handheld-pico8-console-part...

https://blog.davidv.dev/pico8-console-part-2-performance.htm...


As much as I love Lua, it's very difficult to shoe-horn into an 8-bit CPU, especially with limited RAM... but there are other efforts to bring more modern languages to these platforms, and one that strikes me as interesting is dflat, from 6502Nerd:

https://github.com/6502Nerd/dflat/wiki

(See language description here: https://github.com/6502Nerd/dflat/wiki/2.-Language-Descripti...)

Maybe something like this could evolve/be adapted for continued modern development needs?


I dream of a pico8 clone, but with forth instead of lua....


There are some forth WASM compilers, right?

The Pico8-inspired TIC-80 project can use WASM, although it’s a pretty heavy fantasy console too. The WASM-4 project might be another option to look into.


Agon Light looks awesome! I like my ZX Spectrum Next a lot, so I appreciate these dedicated machines.


I've been playing with this for 30 minutes, and I'm still smiling my head off. It's just so much fun. I have used Pico-8 a bunch in the past (so it was easy to jump into making stuff). Pico-8 is one of four bits of software that I put in my basket of "software that sparks joy", along with Aseprite, Blender, and Propellerhead's ReBirth.

Pico-8 had so much care put into its goals and intentional limitations: and so far Picotron seems to have that same level of love and thought. It's delightful, and I don't want to stop making things with it.

I've used many of the clones of pico-8 and they all feel like they miss the point. They "improve" on the limitations, but are just... not satisfying. Funnily enough, I've tried three times to make my own JavaScript version of what Picotron is ("what if I made a more feature-rich version of Pico-8 to use for prototyping in game jams?") and each time abandoned it because it felt like the Pico-8 clones: adequate, functional, but not inspirational.

I don't know who makes Pico-8 and Picotron, but hats off to you amazing person/people for making such likable software!


> "software that sparks joy"

I too put Aseprite in this category, but the big one for me is Godot. After years of from-scratch OpenGL projects and dabbling with Unity, I leaned into Godot 100% around 2020, and ever since it has been my #1 joy-sparking piece of software.


Around 2016 or so I had concluded that game dev has just stopped being fun, but luckily a friend talked me into trying pico-8. It's hard to describe what this little piece of software did for me, pure white magic! Just around New Year Godot finally 'clicked' for me and once again I am super excited to tinker and prototype. I'm almost too scared to try out Picotron now. Almost.


idk, i love Tic-80 way more. For me, the better aspect ratio, ability to use a different language, and not having to use a custom lua stdlib wins out


> Picotron apps can be made with built-in tools, and shared with other users in a special 256k png cartridge format.

I’m noticing a trend of newer indie software distributing assets in png files, what’s with that?


Aside from the fun factor of an image containing a runnable game/program, the PNG format is lossless, uses the same compression algorithm as ZIP (DEFLATE), and has encode/decode libraries in various languages. That makes it a good candidate for an application data format.
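Those properties can be shown with a minimal, illustrative PNG writer - a sketch, not Picotron's actual cartridge format; `bytes_to_png` and the 64-pixel width are arbitrary choices. The IDAT payload is literally just zlib (the same DEFLATE used by ZIP):

```python
import struct, zlib

def chunk(tag: bytes, data: bytes) -> bytes:
    """A PNG chunk: big-endian length, tag, data, CRC-32 of tag+data."""
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data)))

def bytes_to_png(payload: bytes, width: int = 64) -> bytes:
    """Pack arbitrary bytes into an 8-bit grayscale PNG, one byte per pixel."""
    height = -(-len(payload) // width)            # ceiling division
    padded = payload.ljust(width * height, b"\0")
    # Each scanline is prefixed with filter type 0 ("None").
    raw = b"".join(b"\0" + padded[y*width:(y+1)*width] for y in range(height))
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 0, 0, 0, 0)  # 8-bit gray
    return (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raw)) + chunk(b"IEND", b""))

png = bytes_to_png(b"cart data goes here")
```

Because every byte survives the round trip exactly, the image doubles as a reliable container - something a lossy format like JPEG could never guarantee.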


Picotron is by the same person as PICO-8 which is, to my knowledge, what made fantasy consoles popular.


it's fun and easy to share :)


In a world full of SERIOUS BUSINESS ALL THE TIME it’s nice to see something decide to be fun for the sake of it. It’s a cool digital homage to cartridges, which are basically also rectangles with cool graphics on them that run a game.


It would be fun to be able to take a picture of the png and have it load up the application. I know it's more of a desktop thing.

But even emailing scripts for work got flagged. This png format would probably avoid that.

Also, good thing it's lossless. Otherwise those multiple-save jpg artifacts could cause interesting bugs.


> Otherwise those multiple save jpg artifacts could cause interesting bugs.

There's a whole subculture that embraces glitches in gaming, graphics, etc. Folks run around collecting screenshots and videos of these ephemeral artifacts. It's fairly complementary to speedrunning gaming culture.


> I would be fun to be able to take a picture of the png and have it load up the application.

Reminds me of the time they would distribute tape deck based software via the radio.


It’s fun, mostly. Also, PNG has a handy alpha channel you can use to store data. I believe the previous console from this developer, PICO-8, started the trend.


It's a fun little thing but BEWARE! It's still a bit buggy and crashy* and rough around the edges. You can kinda see what Zep is going for, but a lot of it is quite mysterious and there's little in the way of API docs (as in, people are having to print all the global Lua tables to figure out how to do stuff).

*Not as much as 0.1a but there's still kinks to be worked out for 0.1c.


Pico-8 was and is one of the most pleasant pieces of software I have used. I can only imagine the wonders the community will produce for this thing.

Of course, despite the machine itself (pico 8 that is, and this thing too) being proprietary, all the user-programs are source-available if not open source. It's really educational and I love it.

There will be compatible implementations of this thing, but the pico-8 tools were so refined, and pico-8 was so cheap, that I can't imagine not giving the dude 10 bucks (the open source implementations might just run the programs but not come with all the cute tools like the IDE, the pixel sprite/map/etc. editor, or the music tracker). It was well and truly worth the money. Pico-8 is one of the only pieces of paid-for software I haven't hated.

Tl;dr: I think pico-8 is wonderful, I think the community and free programs are wonderful, and I think given that, this will also be wonderful.

I'm a fan and have been for a while.


Can you – or someone else – write about why Pico-8 is so much better than other fantasy consoles? In particular, I've been intrigued by WASM-4 recently, and someone else mentioned TIC-80 which also looks good. I remember reading about Pyxel and getting inspired. All three of those have the benefit of being free, so why would I pay for Pico-8?


Pay because it’s inexpensive and you are supporting the development of a platform that brings joy to a lot of people (including children). It’s hosted (splore for finding games), a community forum is maintained and is a wealth of knowledge. It’s a hub for learning. Paying for pico-8 is like donating to Wikipedia. Basically, you are putting a few dollars towards a “good thing”.


I don't buy that argument – why shouldn't I donate to TIC-80 instead, since it has the potential of reaching also children whose parents don't have $15 burning a hole in their pockets?

I'm not trying to be contrary, I'm really just trying to find what the unique thing about PICO-8 is since nobody has been able to articulate it, yet many people appear to feel it.


TIC-80 is heavily inspired by PICO-8. Supporting PICO-8 enables the creator of the original technology to continue producing creative works that seem to inspire a lot of derivative projects. Whatever the case, if you don’t agree, then don’t buy it. It’s pretty simple in that regard.


I think you missed the point, I perceived the question (which I'm asking myself too) how do these differ? What makes one more fun or better than the other?


I use both pico-8 and tic-80. I like both of them, but I like pico-8 better. Why? Aesthetics pretty much - and isn’t that enough? These aren’t tools to get things done; they are more like songs you listen to.


It feels a little like iOS vs Android at that point...


Why not both? If you can afford it, of course.


That could make sense if both are equally good, or if it's down to personal preference and one has to try both.

But GGP made the argument out to be altruistic, that paying for one over the other is because it's better for the world. If that is the motivation, I would want to donate however much I could afford to the one with highest impact!


Not sure if you'll read this but I did end up buying Pico-8 mainly for the reason embedded in the middle of your comment. It really is a hub for learning. That feature seems undersold (or I didn't appreciate its value).


PICO-8 has a free online edition: https://www.pico-8-edu.com/


This is super useful to know about! The sprite designer & waveform editor / tracker is a really good creative introduction to computers for small children. And you can jump straight in to doing this with the above web link.

(For those new to Pico-8, hit 'esc' from the Lua console to bring up the editor tools, then click on the icon in the top right.)


The Pico-8 is great, but https://tic80.com/ is really cool too.


TIC-80 is cool, but it's a clone of PICO-8, and like Doom clones, some of them are great, but they're still not Doom.


I wholeheartedly agree with you that TIC-80 is not as great as PICO-8 is, and I would never recommend it over PICO-8 to someone who wants to start their adventure with game development.

But it is not a clone of PICO-8. It offers a resolution that's very similar to that of the Game Boy Advance, so it serves as a nice transition stage towards GBA development. You can then enjoy your games on a console like Anbernic RG351P that's optimized for GBA games (2x integer scaling, same screen ratio). It's a specific use case, but one where TIC-80 shines.


Just for information: the Powkiddy RGB30 is the current de facto PICO-8 handheld because of its 1:1 ratio screen running at 720x720 pixels.


Yep. I bought a yellow one with the intention of making it a dedicated PICO-8 machine, and it is wonderful. It's not as perfect as 351p is for GBA, as 5x integer scaling leaves you with some unused screen space, but still, an absolute joy to play.


>Pico-8 was and is one of the most pleasant pieces of software I have used

Indeed, but I have a gripe with it that I cannot get over: the editor's font is too damn hard to read. I tried to get used to it, but to no avail. The games, however, are very playable, fun, and inspiring, and the community couldn't be better.


I now only use VSCode to code .p8 files, and the IDE for everything outside code.

I'm too spoiled by modern text editors to accept the embedded one for any length of time.


> CPU: 8M Lua VM insts / second

Is that ballpark, or throttled for consistency? The FAQ has a "How Fast is the CPU?" item, but that just discusses being fast and faster than PICO-8.


It is throttled, although we’re still working out the exact details.

Practically, it’s not significantly more headroom than PICO-8 had because the screen is so much larger. You’ll have to use a low resolution screen mode if you want to do CPU-heavy things that wouldn’t fly in PICO-8.
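A back-of-envelope illustration of why the larger screen eats the headroom. Only the 8M insts/sec figure comes from the spec quoted above; the 60 fps rate, the 480x270 Picotron screen size, and the rough one-VM-instruction-per-pixel cost are assumptions for the sake of the arithmetic:

```python
# What "8M Lua VM insts / second" means per frame, assuming 60 fps
# and roughly one VM instruction per pixel touched (both assumptions).
INSTS_PER_SEC = 8_000_000
FPS = 60
budget_per_frame = INSTS_PER_SEC // FPS        # ~133,333 insts per frame

pixels = 480 * 270                             # 129,600 pixels full-screen
# Touching every pixel once already consumes nearly the whole budget,
# which is why CPU-heavy carts need a lower-resolution screen mode.
headroom = budget_per_frame - pixels
```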


I bet throttled since PICO8 does that https://pico-8.fandom.com/wiki/CPU


Man this feels great to me. The Pico-8 feels a bit too old-school and janky to me despite being a great bit of software, the picotron feels a lot more like my childhood. I'm excited to start playing with it!


Seems like this would be awesome on one of these Clockwork devices: https://www.clockworkpi.com/shop?page=2


I've had pico-8 on a Clockwork I borrowed, it works great for most applications!


The uConsole advertises support for pico-8 so it seems like they had this in mind :)


QR codes on cardboard slid under a cheap reader slot? Cannot go past the 8-bit feel demanding some physicality behind the thing. Lo-fi screen and giant buttons to mash...


It’s a bit of a shame that it’s apparently not fully compatible with PICO-8. I’d imagine it to be a perfect environment to create PICO-8 games.


In what ways? Maybe there is a chance to change it!


See https://www.lexaloffle.com/picotron.php?page=faq

> Picotron supports PICO-8 style shorthand syntax, almost the whole API, and other compatibility features that make it relatively easy to port PICO-8 cartridges. However, it is not designed to run PICO-8 carts out of the box, because the underlying machinery is quite different. For example, Picotron uses floating point numbers, and so can only approximate PICO-8's fixed point math behaviour.
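To make the quoted difference concrete, here is a hedged Python sketch of 16.16 fixed-point behaviour - a signed 32-bit value with 16 fractional bits, in the style of PICO-8's numbers. This is illustrative only (the helper names are invented, and it is not PICO-8's actual implementation):

```python
# Illustrative 16.16 fixed point: signed 32-bit, 16 fractional bits.
def to_fixed(x: float) -> int:
    return int(round(x * 65536)) & 0xFFFFFFFF   # wrap into 32 bits

def from_fixed(v: int) -> float:
    if v >= 0x80000000:                         # reinterpret as signed
        v -= 0x100000000
    return v / 65536

# The smallest representable step is 1/65536, so values quantize...
assert from_fixed(to_fixed(0.1)) != 0.1
# ...and overflow wraps around instead of growing, unlike floats:
assert from_fixed(to_fixed(32768)) == -32768
```

A cart that relies on this wrap-around or quantization (some PICO-8 games do, for hashing or pseudo-random tricks) will behave differently under floats, which is why carts can only be approximated rather than run out of the box.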


One of the big differences is that Picotron supports floating point math whereas Pico-8 is all fixed point.


Don't know the details, but apparently "[it's] not designed to run PICO-8 carts out of the box". PICO-8 has a really low resolution, lower than a Game Boy, which makes it a bit difficult to write code in.


That would be really cool and a fitting level of meta if the Picotron could be used to develop PICO-8 games. I guess you could bolt a PICO-8 onto the side of the Picotron like one of those 90s console devkits.


Looking at this page makes me wonder if Mario Paint could be considered a "fantasy workstation".


It's not open source but it's really good looking, nice work!


I'm just wondering if there's a toy project for implementing the operating system for a sci-fi spaceship. Would it run on Kubernetes?


About ten years ago, Markus Persson, the creator of Minecraft, was working on a game called 0x10c, which was going to be a sci-fi spaceship game where the various functions of the ship were controlled by 1980s-era computers, leaving the programming parts to the player. There was a community that sprang up that wrote code, device drivers, etc. An interesting idea that died on the vine.

https://en.wikipedia.org/wiki/0x10c https://github.com/lucaspiller/dcpu-specifications


luv me 80s/90s computing aesthetics, 'ate 'aving to deal with the 'ardware to run software that 'as them, simple as.


> luv me 80s/90s computing aesthetics, 'ate 'aving to deal with the 'ardware to run software that 'as them, simple as.

Late shaving cardware Roombas


I like the idea of using constraints from hardware to drive software design, but the thing that always bothered me about pico-8 is that a lot of the model isn't fully constrained: As far as I could tell, the amount of memory available through the pico-8 lua interpreter is unbounded, controlled by the host OS.

Anybody know if the picotron is more tightly bounded in this way when it comes to memory usage in the programming system, and elsewhere, to turn it into a "true" constrained environment?


Looks delicious. Just bought one. The Mac binary is not signed - is that intentional? (I'm fine with it not being signed, just asking.)


Don’t you need an Apple developer account to get certificates to sign your stuff? If so that might explain it since that would be… what $300 a year? On top of likely having to go through the whole Apple Store acceptance process.


You don't have to submit apps for them to be signed by you, but you do need to pay $99/yr. Tbh I think it's fair considering Xcode is free.


I’m too unfamiliar with Xcode to know much about it. Do you need it to release software for macs?

I’m not sure how Apple gets away with forcing people to pay $99/yr to be able to let people install software without getting a warning. I guess it’s a minor issue. I have added a few installs to my “yes I really want to use this software” list on my m1 air, but I still think it’s a little bit silly. It’s obviously some sort of security feature, but Apple isn’t my mother.


It's an effective "barrier to entry" to level up software quality, or rather, to keep poor publishing out.

Apple gets away with it because Mac users tend to pay much more for software than users of other OSes.

Not saying it isn't some sort of business extortion


I consider Xcode's price to be included in Apple's margins.


Are there any apps made for pico (other than games) that have broken through to the mainstream?


Depends on your definition of "mainstream" I guess, but picoCAD[1] got some attention outside of the PICO-8 world. Edit: Including here on HN[2]!

[1]: https://johanpeitz.itch.io/picocad

[2]: https://news.ycombinator.com/item?id=34101251


I believe the game Celeste started as a PICO-8 prototype


I'm a bit confused. I was about to buy this, but when I logged into my account, it looks like I already own it? At least the alpha.

I already owned a legit copy of Pico-8 and Voxatron...do I get it automatically?


I stopped reading quickly once I realized they don't know what the relevant (and by now multi-decades-old) terms mean, or simply didn't care whether they abused them or confused people. Time is too precious to waste on this.


PICO-8, I think, still has its place


When I use Raspberry Pi OS in a Raspberry Pi 4, 8GB of RAM - I feel I already have an excellent, refreshingly stable, late-90s-era experience. It scratches that strange nostalgia itch for that more innocent experience - of early-times WIMP computing.

I can surf the web, edit LibreOffice files, record audio in Audacity on my nice Rode microphone, watch video files in VLC, remotely VNC in, transfer files in and out over SSH's SFTP, etc.

Pretty much all that's really missing, to fill it out, is Zoom (or some such functional equivalent) with a fast-enough frame rate on video calls. And this is not, strictly speaking, the fault of Raspberry Pi, et al.


While all those pointing out that 8GB of RAM was mainframe stuff in the 90s are absolutely correct, I would offer that the software bloat between 90s software and modern software does make the _experience_ roughly comparable.

Except for the built-in HDMI video and the seamless plug-n-play networking that is...


I used 8GB of RAM until recently and found it more and more untenable. The primary issue is the browser: even now, on 16GB, I restart Firefox every couple of days. But other things also eat RAM like crazy.

Running Emacs/Cider, I'd have to kill other apps and reboot my REPL a couple of times a day. Emacs would also leak memory and need a restart every couple of days.


I guess the primary issue is the chair-->keyboard interface. The fact that browsers can open an unlimited number of tabs doesn't mean you don't have to do a little housekeeping.

Plenty of people still use systems with 4GB or less, and it works fine as long as the number of tabs they open is limited.


The primary issue is that browser developers are people who can afford kitted-out MacBook Pros, so the system isn't designed to scale down to small/weak systems :))

I don't believe a browser couldn't be designed to have a small RAM footprint. All my tabs could be suspended and saved to disk when in the background (and not running any tasks), then read back into RAM near-instantly when I tab back to them.
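That suspend-to-disk idea is easy to model. Here's a toy sketch in Python (purely illustrative; real browsers manage tab memory very differently): keep the most recently used tabs live, spill the rest to disk, and read a tab back in when it regains focus.

```python
import json
import tempfile
from collections import OrderedDict
from pathlib import Path


class TabCache:
    """Toy model: keep at most `max_live` tabs in memory;
    spill the least-recently-used ones to disk as JSON."""

    def __init__(self, max_live=2, spill_dir=None):
        self.max_live = max_live
        self.spill_dir = Path(spill_dir or tempfile.mkdtemp())
        self.live = OrderedDict()  # tab_id -> state dict, oldest first

    def _spill_path(self, tab_id):
        return self.spill_dir / f"tab-{tab_id}.json"

    def open(self, tab_id, state):
        self.live[tab_id] = state
        self.live.move_to_end(tab_id)
        # Evict the oldest tabs to disk once we're over budget.
        while len(self.live) > self.max_live:
            old_id, old_state = self.live.popitem(last=False)
            self._spill_path(old_id).write_text(json.dumps(old_state))

    def focus(self, tab_id):
        """Switching to a tab reads it back from disk if it was spilled."""
        if tab_id not in self.live:
            path = self._spill_path(tab_id)
            state = json.loads(path.read_text())
            path.unlink()
            self.open(tab_id, state)
        self.live.move_to_end(tab_id)
        return self.live[tab_id]
```

The hard part in a real browser isn't this bookkeeping, of course; it's that a "tab" is a live JS heap, DOM, and network state rather than a small JSON blob.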


No, Firefox and/or Raspberry Pi OS are just not well engineered for this use case.

My Chromebook with 8GB of RAM has dozens of tabs and web apps open in Chrome, and runs one VM with Android and another VM with Linux, in turn running Firefox and more. All without breaking a sweat.


I run Firefox on 8 GB without any trouble whatsoever. But I also rarely open more than a dozen tabs.


I use Firefox on an 8GB, early-2013 MBP, with hundreds of tabs and an extension, Auto Tab Discard, that unloads/suspends them after a couple of hours.

Works beautifully. I have to restart the computer about once a month because of Catalina bugs, but Firefox is super stable.


> I have to restart the computer about once a month because of Catalina bugs

I shut down my laptop at the end of the day and turn it back on the day after, regardless of bugs. Why do you try to reboot it as little as possible?


It’s my home computer. It’s there to be used intermittently when needed at random times. It sleeps drawing almost zero power. Why should I shut it down?

Shutdown takes 20 seconds. Startup requires the FileVault password, then 20-30 seconds, then a login, then another 20-30 seconds until desktop is usable (and a few more until Firefox is).

If this was my work computer, it wouldn’t be so inconvenient to restart / shutdown once a day. But for what reason?


I guess the main difference then is that I have to wait less to boot my machine. Did you consider hibernation? It would be a bit slower than stand-by, but then it would draw exactly zero power.


The startup/shutdown time is possibly explained by its being a 2013 machine, but I have little reason to replace it right now. It's only 8GB, with an old slow CPU (by modern standards) and an old slow SSD, but it does Firefox, Thunderbird, the occasional Python script, and a few more things perfectly well.

I’ll replace it when it breaks.

With respect to power draw: there is no simple way to force hibernation on Catalina AFAIK, but the power draw in sleep is minuscule. It hardly registers on my wattmeter (and e.g. it loses only 2-3% of battery per day while sleeping).


I mean, nominally, that sounds like his preferred experience?

I'm similar: I prefer maintaining state with things until I'm done with them, which makes the current models so frustrating. For all Apple talked about skeuomorphism, it's always felt so fake to me, only ever skin deep. If I open a webpage, it should stay that way until I'm done with it.

I've navigated down a third of the page? I've partially filled in a form field? Keep it! There's probably a reason I put that there!

It's not like a piece of paper I put down on my desk resets to its original appearance and orientation every morning. It retains the scribbles and notes! Maybe you like someone else tidying your desk every morning, but I hate it!

Real things persist, and our memories exploit that property so well. What do we get instead? Software that's all about returning to some pristine state, which makes it harder for me to recall and use things.

I'm curious if they're going to do this same thing with their spatial os, or whether they'll work out that persisting things until people are done with them is a feature.


I wasn't complaining about someone else's experience, I was just curious to see why he preferred it that way.

Sometimes I also use stand-by or, if it is to keep the state up until the next day, hibernation. But that's rare. In most cases, I just use Firefox's function to restore my previous browsing session. That would not restore half-filled forms, but I rarely deal with forms, especially long ones. As for reading or editing documents, most software will open the document at the point where you were when you last closed it.


'Auto Tab Discard' was a game changer for me using Firefox. I was about to upgrade to a 64GB laptop.


It helps, but strangely doesn't eliminate the problem entirely.


Perhaps I don't see it on 32GB, but I can relate regarding Cider/Emacs.


12GiB and copious swap. 4 profiles open with 50-100 tabs loaded at any time.

The only problem is the accumulation of CPU use from web apps.

Consider adding more swap space so that older tabs have an out-of-the-way place to stay.


i wish firefox had more granularity with its tab unloading feature. one method is restarting the browser, but by default it'll unload tabs if you're low on memory. you can force em in about:unloads (hint: about:about if you ever forget)

i wrote a simple firefox extension that unloads em all with a button click, but it’s no different than restarting ff or spamming clicks in :unloads
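An extension like that is tiny. Here's roughly what it boils down to, using the Firefox WebExtensions tabs API (`browser.tabs.query` / `browser.tabs.discard`); the `tabsApi` parameter stands in for `browser.tabs` so the logic can be exercised outside a browser, and the names here are illustrative, not from any published add-on.

```javascript
// Discard every tab that isn't focused and isn't already unloaded.
// `tabsApi` stands in for `browser.tabs` in a real extension.
async function discardAllInactive(tabsApi) {
  // Skip the active tab (discarding it would be disruptive) and tabs
  // that are already discarded (a no-op).
  const tabs = await tabsApi.query({ active: false, discarded: false });
  const ids = tabs.map((tab) => tab.id);
  if (ids.length > 0) {
    // Bulk version of clicking "Unload" entries in about:unloads.
    await tabsApi.discard(ids);
  }
  return ids;
}

// In a real (Manifest V2) extension you'd wire it to a toolbar button,
// roughly:
//   browser.browserAction.onClicked.addListener(() =>
//     discardAllInactive(browser.tabs));
```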


We already know the solution: run the browser in Wasm inside the browser. That way it only has one tab open, and that tab does 99% of its work inside a Wasm env. I thought Firefox would ship new versions of Firefox inside of Firefox at some point. I guess that point is still in the future.

This of course is how electron should work as well. A canvas only frame that loads whatever rendering system you want, which could be a browser, or it could be Unreal engine.


Isn't this what SDL (https://www.libsdl.org/) is for? Some cross-platform (and pretty light) hardware abstraction where you can have a canvas, do 3D, audio, whatever...


I can't tell if this is sarcasm or not, but assuming it's not...

How many levels of abstraction do we need to run software reliably? The fact that browsers have effectively become operating systems should be worrying enough.

No wonder we have all these fantasy projects that take us back to when our computing environments were actually pleasant to use. I would partly blame the invasive tracking and needless complexity of modern OSs for that, but the ever growing software layers around hardware makes no sense at all. We should question any such design decision, and strive to simplify this ball of complexity instead.


There’s AutoTabDiscard which lets you set a timeout and other rules for when to discard.


Is AutoTabDiscard still helpful/necessary with whatever automatic stuff Chrome/Firefox does nowadays? I run a lot of tabs (100s-1000s) and this sounds potentially very helpful if it does something beyond what the browser automatically does.


I think Firefox will delay loading on session resume (that is, if you restart Firefox, tabs will not load until you actually switch to them), but will not unload anything automatically. So if you open 200 tabs between restarts and keep them open (which I sometimes do, for a week or two until I close them), it makes a difference.


If you toggle browser.tabs.unloadOnLowMemory, it will unload automatically, though I'm not sure how low memory has to get before it triggers.

In my original comment above I wrongly assumed it _was_ on by default.


Try Sidebery extension.


NoScript, and only turn on JS when required, and only for the site itself. Hugely reduces memory requirements.


Really? I can only hold one Firefox session at a time in 8GB of RAM, but I've always assumed that's because I keep an unreasonable number of active+background tabs open.


> 8GB of RAM [...] late-90s-era experience

Not even close.


I started my studies in 2003, and the most I could manage was 768MB of RAM. I remember the amount exactly, because my LL(1) grammar compiler was leaking memory during my pre-presentation test, and I had to present it to pass the course. Every MB of memory counted, and it started swapping. I was praying for it to make it through, but when the amount of swap used exceeded the amount of RAM, I gave up. However, I was last in the queue to present, and the teacher told me he needed to go, as it was too late. I was so happy I could barely hold in my laugh. I came back to my dorm and fixed it the next day. Core memory. That's also how I fell in love with my wife :D She was doing the extreme programming with me.


Most of us had about 16 or 32MB at the time, IIRC.


Unless you had RAMDoubler(TM)! Remember that?!


Not even close. If you had said that about an ESP32-based system, then I would have agreed.

My 2003 bought multimedia Athlon XP desktop had 512 MB!


The PC I was using in the late 90s had maybe 32 MB of RAM after an upgrade... when I built a PC (2001?) with 512 MB, it looked like an infinite amount of RAM...


I remember just laughing when I heard that Adobe had fixed a bug that occurred when running Photoshop in more than 1GB RAM on Mac OS 9. It seemed like such a theoretical thing to have that much memory.


Zoom alone takes 2.4GB of RAM, just after being launched and starting a meeting - and no-one's even joined the meeting yet.


You weren't born in that late-90s era, right? :D


Correct. 70's


It's a sign that personal computing has gone way, way off the rails that we make pretend computers to run on our real computers just to have fun ways to compute again. I really, really appreciate work like this, but why aren't our actual operating systems "cozy" enough to support creative work anymore?


Remember the days when all home computers came with a BASIC interpreter preinstalled, and that was the first thing you saw when you started the computer? Later generations (Amiga, Atari ST) also had BASIC included with the OS. Not that familiar with the original Apple Macintosh, but from what I read that was the first computer to ship without programming tools. Windows then followed suit, and today all OSes ship without developer tools by default. Of course they're just a download away, but those are mostly tools for professional developers, so not really beginner friendly.

Also, the limitations of 8 bit (and 16 bit) computers also made them more approachable. I "designed" some cool-looking sprites (actually they were called "players") on my Atari 800 back in the day, although I'm not good at drawing, so I would be hopeless at producing something more hi-res...


"Classic" Windows usually came with DOS which included BASIC, with the main difference being that in Windows 95/98/Me it no longer had an editor, IIRC.

The original IBM PC, in the absence of another drive, would attempt to boot from cassette and then drop you into a similar BASIC interpreter; the GW-BASIC included in DOS was the same, except it shipped entirely as a file on the disk drive instead of being in ROM.

NT didn't have an included programming language before NT 4.0 SP4, when WSH was added; it was also part of Outlook 97 and IE 3.0.

The original computer targeted at the general population to ship without any programming tools was the Apple Lisa; I seem to recall mention of at least one loud consumer complaint, if not a lawsuit, based on the expectation that a general-purpose computer should have some tool included.


> with the main difference being that in Windows 95/98/Me it no longer had an editor

It was on the CD but it wasn't installed automatically.


Linux comes with Python included. (Python is the new BASIC, and explicitly designed to be so.)
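The type-in experience translates almost directly. The archetypal first BASIC program, `10 PRINT "HELLO"` / `20 GOTO 10`, is just `while True: print("HELLO")` at the Python REPL, and the classic second program, guess-the-number, is about as short. A toy sketch (the function and its shape are mine, not from any tutorial):

```python
import random


def guess_the_number(guesses):
    """Play one round of the classic BASIC-era game.

    `guesses` is any iterable of attempts (stdin in an interactive
    version). Returns the number of guesses taken, or None if the
    player gave up before finding the secret.
    """
    secret = random.randint(1, 100)
    for attempt, guess in enumerate(guesses, start=1):
        if guess == secret:
            return attempt
        print("higher" if guess < secret else "lower")
    return None
```

The point isn't the game, it's that this fits in a beginner's head the same way a page of BASIC once did.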


Well it's for the same reason that Twitter is popular: intentional limitations that cut everyone down to the same height make something approachable and feel friendly. Nobody can excel on the Picotron, so it's inviting to try because you won't be comparing your work to someone else who did something so much more impressive. Likewise in classical Twitter nobody could write a truly great tweet due to the character length limits, and that set the tone and encouraged everyone to get involved. Compare with blogging on something like Substack where people who might otherwise publish something end up comparing themselves to Scott Alexander or Matt Taibbi and concluding they can't compete.

I think in computing there's the other issue that modern programming has a big focus on safety and security which was absent in the 8-bit era. If you sit down to make a Mac app you're not only going to compare your work to Apple's own, but you're also going to be constantly distracted by things that aren't "fun" like slow compilers, type systems, notarization and code signing etc. These are all important for people who use computers as end users but if you just want to hack about and make something they suck away the energy.


> intentional limitations that cut everyone down to the same height make something approachable and feel friendly

I wonder if generative AI might someday have a similar effect? Imagine a "make me a game" tool, with LLM-like "Fortnite, in space, with cute animals, and classical music". Ok... "the default music sync with action is fine, but as health declines, make the tone darker. And give my dog an oboe theme." Removing design-space cliffs, scattering defaults and highways, adding exoskeletons, as alternatives to shortened horizons. Kids today finger paint with pigments that would be the envy of painters past who ground their own - "use only charcoal" still has a role, but... there are also neon pens with sparkles for diaries, stamps in kid paint programs, and ... . Imagine a future coloring book, with speech-to-text-to-outline-image, collaborative coloring, and "ok, now make that a 3D rigged avatar, skinned in the style of an oil painting". Making it easier to fly around the space, rather than lowering the ceiling.


> Nobody can excel on the Picotron,

Uhhh... have you seen PICO-8 development? People can excel on that thing. The limitations make the achievements even more remarkable. If you want to see excellence in coding, combine the two and check out the people who wrote BBC BASIC raytracers in a tweet. If anything, we're in a glut of shitty code today partly because our comparatively powerful machines, combined with a race to the bottom in terms of churning product out quickly, make writing and shipping something extremely unoptimized far, far easier than taking the time to polish the end product.

I think you're onto something, in that the Pico-8 and Picotron are going for the "vibe" of retro home computer/console programming but are not capturing the true essence of it. With 8-bit home computers, you started off in BASIC and could build simple games and stuff -- but if you wanted to write anything performant then you had to drop down to assembly and there was a significant difficulty spike there. So even back then we were dealing with "unfun" stuff. (In general, the enjoyment you got out of such work was proportional to the effort you put in.)


Yeah, I know. I had an early Acorn machine as a kid and couldn't figure out how my favourite games were made. I was aware they weren't using BASIC, but how they really did it was a mystery. And 3D graphics like Elite left me foxed. I tried to do my own but had never heard of trigonometry so that didn't go far :)

Even so, the span in which you can excel is far more limited. Nothing stops you from making 8-bit graphics today (see Notch), but people, and especially kids, will compare what they can do to, say, Call of Duty and lose interest when they realize how far away they are. Micro games at least tended to be made by one person, so it was theoretically possible to reach that level of skill yourself.


Because a computer is a general-purpose tool. A computer is not a box made to be cozy and support creative, limited programming work.

If you're looking for specific use-cases, that's exactly what userland software is for. Userland software takes the general computer and converts it to something specific. If you are looking for a cozy environment that supports creative, limited programming work, you run userland software for that!

It's like software-defined networking except software-defined creative environments. Some people prefer Photoshop, and others Picotron. The computer gives you the choice, and userland software is the mechanism by which it does so.

If anything, I'd like to turn your observation around: isn't it marvellous that the same machine allows one person to run Photoshop and another Picotron, with almost no change required to switch between the two environments?


> A computer is not a box made to be cozy and support creative, limited programming work.

That's a pretty hard line you have drawn there. There is no reason why it cannot be that. There are several open source window managers which tried to have a vibe. KDE had a cozy vibe. We have a Hannah Montana Linux, which was definitely awesome as a kid. I find it obnoxious that society has decided these infinitely flexible machines will have the personality of an iron smelter.


"Cozy" and "general purpose" are not mutually exclusive. Emacs is very cozy, and it can do frickin' everything.

(Maybe Emacs is not cozy to you. But the fact that it is to a significant number of people explains why it still has fans in a world where Visual Studio Code is eating everything.)

Picotron is supposed to emulate a particular form of cozy: that of the very general-purpose computers of the 1980s and 1990s: classic Mac, Amiga, Atari ST, even Windows. What I'm lamenting is a sort of fall from grace wherein even Microsoft, of all companies, tried to shape the computing environment to bring support and comfort to the user rather than exploit them and introduce churn and friction for its own sake.


I agree, why do so many think that an immersive computer environment that makes the full power of the machine ergonomically ready-to-hand is some kind of retro thing? It sounds like a futuristic improvement to me. 40 years ago we had bicycles for the mind. Today I want a Kawasaki h2r for the mind, but the tech industry wants me to ride the bus.


Others have mentioned a limitation-creativity link. But I wonder if there's also an implicit... "impedance match" with the current state of interface devices? "We'll make it more creative and popular by requiring physical punched cards! Think of the lovely chunkchunk-chunkity-chunk sounds!" or "You have to hand-punch holes in paper tape!" would seem unlikely. On the other hand, decades-old UX is well matched to today's still-decades-old keyboards.

When I wanted to make my own laptop more "cozy", without the silliness of "you can only press two keys at a time, so no chords", and "most of it isn't a touch surface, and it can't even tell which fingers are on the caps", and "it's oblivious to hand pose and gestures above the surface", and "the screen is only 2D and can't even tell where you're looking", I had to kludge the entire stack from hardware to apps. If you could sculpt, dance, and sing code, perhaps 8-bit might have less appeal? Like the appeal of entering programs with faceplate bit toggles instead of a keyboard?

Maybe. Counter argument: pico-8 mobile/tablet. Counter counter, historical state of pico-8 mobile/tablet??


> Others have mentioned a limitation-creativity link. But I wonder if there's also an implicit... "impedance match", to the current state of interface devices?

No, it's a software (& hardware) design issue. Computers just aren't made to be tinker-friendly anymore.

Eg. back in the day, I had a trio of editor+assembler+debugger on MSX2 (often running from RAMdisk). For many programs, edit-assemble-test cycles were a few minutes at most. With nothing loaded, machine would boot into BASIC seconds after power-up.

So: develop on target device, even with that being Z80 based machine with ~256 KB RAM (which was already comfortable). Several vendors of these MSX machines would send you a full schematic / service manual for a nominal fee. Hardware mods were commonplace. Youngsters who'd never touched a computer could be tweaking BASIC programs within an hour. With patience you could wrap your head around the whole machine.

Nowadays: boot computer, wait, click on fancy icons. No default programming environment(s) in sight. 'Poke' some hardware port? Not happening. Modify any of the built-in software? Forget about it. Or at best: first download multiple GB's of development tools, spend the next week(s) buried in documentation. Not for the faint-hearted. Let alone newbies.

Yes, computers have become faster. But also more complex. Some of that complexity is justified. Or even necessary. Much of it is not, and is just heaps & heaps of technologies / abstraction layers & legacy cruft.


> tinker-friendly [...] complexity. Some [...] necessary. Much of it is not, and is just heaps & heaps of technologies / abstraction layers & legacy cruft.

Nod. That silly only-on-my-own laptop project had a device-driver->full-screen-browser stack, so "Reimplementing wheels - sigh. But no libinput, linux gesture mess, window managers, xlib/wayland, ... - oh dancing-lightly-through-tulips yay!".

I enjoyed lisp machines, which handled complexity differently. DonHopkins comments on a LispM ergonomics thread:[1] "It was not just the hardware, or the software, or the culture, or the interesting problems you could solve, or the zeitgeist of that time in history, but a rich combination of all those things and more, that is so hard to capture, describe or reproduce -- or even believe, if you haven't experienced it first hand. [] Those giant keyboards, with all their wide special purpose buttons topped with hieroglyphic keycap labels, in combination with the huge screen, three button mouse, and of course all the great software turning on and off the little dots on the screen that you could dive into, explore and modify at will, the printed and online documentation, the networked developer support community, all carefully designed to work together seamlessly regardless of cost, gave you the feeling of being in control of a very heavy, expensive, well built, solid, powerful, luxury automobile, with rich [...]".

I guess I was wondering if hardware-wise, part of what allows pico-8 to retain appeal, is we've largely stalled out on a half-century-old keypress-and-mousemove ux plateau. If it used some other "similarly" old interface tech (front-panel bit switches, paper tape, terminals as forms), the appeal would seem less.

The appeals of pico-8 and lispm seem somewhat related. But for lispm's cutting-edge "luxury car" power-user-ness. Perhaps the -8'ness constraint, and resulting bounded goals, is what makes doing the stack tractable (witness smalltalk but-do-you-have-a-library-for-X struggles)? But they've also struggled with adoption, which the Scratch-like lower-barriers-from-lower-ceiling might help with?

The smalltalk folks, with a (also keyboard-mouse) vision of full-stack, but with high-ceiling kid appeal, have struggled with both. And, maybe, XR might offer windows where society's UX and approach to software is more malleable.

So what are the implications for designing a powerful full-stack system that attracts a broad user base? To take advantage of some possible "XR is now gelling - as with phones, we're about to do a giant societal software rewrite - the course of future societal computer-and-software UX is briefly malleable"? Ideally one that instead of mac/lispm tractability-through-singular-hardware, has complexity-handling-power sufficient to blend Z80-to-insaneXR hardware-software diversity into something accessibly/appealingly/cozy tractable?

[1] https://news.ycombinator.com/item?id=7893008


i've been writing an interface like this called aesthetic.computer and trying to work out a full stack for it - i love the ideas in your writing


tnx! aesthetic.computer: demo[1], live[2]. Includes chatbots... hmm, so maybe Chat(with app) will someday be as familiar as Save and Paint?

[1] https://www.youtube.com/watch?v=S-7UszmI1K4 [2] https://aesthetic.computer/ Then press "Enter" and enter "list".


Limitations and difficulty are the foundations of creativity.

Our current devices are almost unlimited.


I like to start up Dosbox-X or one of the virtual Amiga environments that comes bundled with Amiga forever. Definitely cozy.

More often I use some old application, like the nowadays BSD-licensed ex-Autodesk Animator. It is fun to figure it out and more fun than modern applications in many ways. I even bought an old used book about it and read cover to cover. Limited compared to modern graphics software, but "cozy" is a great way to describe the experience.

https://github.com/AnimatorPro/Animator-Pro


Minor quibble about this screenshot: https://www.lexaloffle.com/dl/wip/picotron_desktop2.png

The wallpaper is named "triplane," but that's a biplane.


I'd rather one of the many open source alternatives to that ecosystem.


However, TIC-80 only exists because PICO-8 does, and without money, presumably PICO-8 couldn't have been made; the dev would have had to do other work to live.

Previously this guy made Voxatron, which I imagine paid for PICO-8, and that presumably paid for Picotron, so if I don't buy Picotron, then perhaps I'll prevent his next work of art from coming to fruition?

Yes it bothers me a little bit that PICO-8 itself isn't open source, but I can't see the alternative, otherwise how can the dev afford to be spending time thinking about and working on these new things?

It's not as though this is a huge company, or that there are alternative means of generating income from it (no Enterprise wants PICO-8 support, for example).

I don't see an alternative to giving the dev a few bucks to keep making art projects that I love.


The pico-8 is very reasonably priced and for a few extra bucks you get Voxatron. I'm a huge advocate of open source but the pico-8 is just so lovely, and the community so creative and accommodating, that I didn't mind contributing. I have yet to check out the TIC-80 but I plan to after getting a little more fluent on the pico.


But TIC-80 is just so much better than PICO-8. Not just the resolutions (which people can argue are an aesthetic choice for PICO-8), but the fact that you aren't limited to Lua and have a variety of languages (some Lisp-inspired) in TIC-80.


Having such specific limits is exactly the point of PICO-8 though. If I wanted a variety of options, I’d be using a more traditional engine or library.


Do you have any examples? I’m pretty curious about Picotron, but would love to try an OSS alternative.


There are plenty of alternatives you can find at [1] in the fantasy-console space; the list covers almost all of them, OSS or proprietary, active or dormant. And honestly, many of them were inspired by PICO-8.

Disclaimer: I'm one of the contributors to the list at [2].

[1]: https://github.com/paladin-t/fantasy

[2]: https://github.com/pico-8/awesome-PICO-8



Picotron looks to be a different product from pico-8.


This looks similar to what the cool couple at 100 Rabbits [1] are doing with Uxn. Overall, I love to support anyone producing hobby / cute software (especially with Lua!).

[1]: https://100r.co/site/uxn.html


The fact that they seemingly write all this stuff from a sailboat makes it even cooler.


Last I checked it was mostly solar-powered too. That's pipe-dream stuff.


I want to thank you. Your comment sent me down an hours long rabbit hole (pun intended) last night into their collective and I've frittered away many more this morning pricing out sailboats I can't afford! Thank you so much <3


Funny, the first time I read about uxn I went down almost the same rabbit-hole and was investigating local used sailboat prices within an hour.


Potato (which uses Uxn) is even more similar to Picotron in that it's like a little desktop environment:

https://wiki.xxiivv.com/site/potato.html


Somehow having a Ranma 1/2 picture open in a window fits absolutely perfectly with my actual young life.


uxn is awesome, and the implementations I have looked at seem to be MIT licensed.


Does anyone also think these "is a" headlines violate commonly accepted headline rules? Arguably it should read: "Picotron, a Fantasy Workstation"


Slow night?


It's 'representative text from the article' in this case, plus it probably doesn't matter in most other cases.


Apparently not.


What rule?


Æsthetics



