The Polygons of Another World: Atari Jaguar (fabiensanglard.net)
281 points by masklinn on March 13, 2020 | 69 comments



I'm a former Jaguar developer too, and reading the article was a strange experience, remembering things from ages ago. I'd forgotten most of the details, as well as how it felt.

The Atari ST mention, too. I'm getting all sorts of 80s and 90s flashbacks from that story, of what it "felt" like to program on those machines.

All those fiendishly clever timing and register hacks that pushed the video and audio hardware well outside its design parameters, and all the other tricks that made 8-bit and 16-bit machines do things nobody thought possible, until someone did them.

That feeling is something I miss a bit, as it's difficult to find tricks that make the "impossible" possible on modern hardware.

[Edit: I suppose Spectre and friends qualify :-)]


The stories of the ST (AMY chip: http://www.atarimuseum.com/computers/8bits/XL/ASG/Chips/AMY/), the botched STE launch (https://www.youtube.com/watch?v=oarR61SeY8E), the Falcon (https://www.youtube.com/watch?v=cBTXGgb__y4), the Lynx (https://www.mentalfloss.com/uk/games/36270/atari-lynx-the-wo...) and the Jaguar (see other comments in this thread) are all great!

Some of the stuff that was done on those machines is jaw-dropping - realtime raytracing on the Falcon, for instance: https://www.youtube.com/watch?v=_cKRZ8QgH5o

Doom on the Atari ST (not Wolfenstein, more sophisticated with heightmaps, lighting, 64 colours etc): https://www.youtube.com/watch?v=QCvx2O5M69E

Complex texturemapping on Atari ST: https://www.youtube.com/watch?v=gHwUchzEG8k

Polygon landscapes on the Lynx: https://www.youtube.com/watch?v=DPexnRsNJDs


You forgot the Quake 2 engine running on a stock 16 MHz 1992 Falcon: https://www.youtube.com/watch?v=WpwlZgQPCpk


As a former Jaguar developer, I'm getting PTSD just looking at the diagram with the four different CPUs, all on the same memory bus in constant contention, all with their own unique machine instruction set.


I didn't know much about this, but was surprised to read "a Motorola 68000 […] running at 14MHz […] two 32-bit RISC processors running at 26.59 MHz".

I wondered how those could share a bus, with that weird 1:1.899 ratio between CPU frequencies. I think https://en.wikipedia.org/wiki/Atari_Jaguar#Processors answers that. It says the 68000 actually ran slightly slower, at 13.295 MHz, which makes the other clocks exactly twice as fast (2 × 13.295 MHz = 26.59 MHz).


Thanks for pointing this out. I did not realize how important it was to not round up.



Yeah, but it was weird at the time for devices in different clock domains to share a classic parallel system bus. They "can" theoretically, but in practice they didn't, because of the latency that cross-domain clocking imposed.


One of the motivations behind GOAL (the Lisp dialect used in the Jak & Daxter games) was to allow programming the PS2's various processors very close to the metal in a single language.

The Jaguar might have benefited from a GOAL-like language of its own.


Andy Gavin has a great interview on the technical side of developing Crash Bandicoot [0], which they wrote in GOOL, the precursor to GOAL. In fact, they wrote GOOL explicitly for Crash Bandicoot. Later, when writing Jak & Daxter, Naughty Dog developed GOAL. Sony ended up buying ND; among other things, they were hoping to use ND's code, and were quite surprised to discover that it was all written in a homebrewed Lisp dialect.

You can see some GOAL here: https://web.archive.org/web/20070127022728/http://lists.midn...

[0] https://www.youtube.com/watch?v=izxXGuVL21o


I just saw that interview with Andy Gavin last week, and I highly recommend it. The title is "How Crash Bandicoot Hacked The Original Playstation" and it truly fits. So much of what they did seems to have been crazy and beautiful performance hacks. Anyone who's had to work with severe limitations to achieve their goal and found crazy ways to make it work will see a kindred spirit.


I saw this interview, too. It blows my mind that Crash himself accounted for a third of their budget for onscreen polygons, something like 550 polys, which is more than many PS1 games managed to render for an entire scene.


They also didn't use any texture memory for Crash, only shading. This went against all received wisdom, and shows how ready they were to face every challenge by going back to first principles.

They also wrote a software-based Z-buffer implementation. And a custom compression system that allowed vertex-based animation. And scene streaming from disc. And...

https://all-things-andy-gavin.com/2011/02/02/making-crash-ba...

Previously: https://news.ycombinator.com/item?id=9278082


Nitpick:

> Crash was 512 polygons in the first game, with textures only for his spots and his shoelaces, and his model didn’t change much through the 3 platform titles. It took me a month to settle on the perfect 512. As Andy said, we went with non-textured polygons instead of textured ones on most of the characters. Instead of texture, we used corner colors to create the textures that seemed to be there.

> There were many advantages to this strategy. The simplest was that we got more polygons. But we also solved a texture stretching and warping issue inherent in the PlayStation’s renderer that tended to make textures look terrible. Since you spent most of your time looking at the character, and he could get quite close to the camera, avoiding texture mess meant a lot for visual quality.

> And there was another important issue solved by using polygons instead of textures. The PlayStation tended to render every polygon as a pixel, no matter how small it got. Had Crash’s pupils been texture, they might have disappeared when they got smaller than a pixel. But by making the pupil 2 polygons (a quad), they almost always showed up as long as the total eye, including whites, was more than a few pixels tall. Subtle, but trust me, it made the game so much more clean looking. It’s the small things that matter.

Source: Jason's comments in part 3 ( https://all-things-andy-gavin.com/2011/02/04/making-crash-ba... )


Thanks, I'd misremembered something read almost a decade ago.


The two Lisps Naughty Dog developed had different (heh) goals. GOOL was more of a scripting language; GOAL was developed for close-to-the-metal programming and was used to write much of the engine for J&D.


While we are on the subject of Lisp-likes for game systems, there is a recent effort for the NES which is intriguing. It has been discussed here on HN, of course; just search the archives.

http://www.dustmop.io/blog/2019/09/10/what-remains-technical...


Yes, I think some of that complexity was due to interference by Atari; they definitely took a dump on the Handy/Lynx by changing the power circuitry and the size of the unit.

From what I've read, the joypads were lifted straight from the Panther, as were the launch games, rather than being redesigned.

The CPU was always meant to be 32-bit with a cache. Atari chose the 68000 for ease of porting games over, but cheaped out rather than choosing an EC68020, which would have increased the bus bandwidth by 4 times.

Same with the lack of a CD unit, which would have helped attract developers; the cost of ROM carts made the platform very unattractive for most dev houses.

Additionally, having the designers do a proper SDK would have helped, especially with the direct memory access bug, which could have been mitigated with an official workaround.

Lastly, the chipset was designed to be able to use 2 memory banks, so 1 MB + 512 KB would've been better than 2 MB on one bank.

Obviously having more time to develop the chipset would have been beneficial. I believe Atari insisted on the object processor in addition to the DSP and blitter, which the designers were not keen on, and which by that time had been demonstrated to be a bit of a dead end (Atari GTIA->MARIA->Copper vs tile/sprite-based VDPs or blitter-based processors). If they'd not put one in, maybe the chips would've been ready earlier/better, or could've had some cache/registers for the blitter.

Even without the chipset improvements, a handful of changes would've made things a lot easier for developers, and faster to boot.


> EC68020 which would have increased the bus bandwith by 4 times.

I thought it was only 2x (16-bit bus on the straight 68k). It did also have a cache though, which likely would have helped muchly. Wikipedia does indicate the EC68020 was considered for the Jaguar 2. My gut feeling is that the EC68020 was considered 'too expensive'; back then 130k extra transistors still meant a few bucks from a cost standpoint, and Atari was already cutting odd corners in a desperate stupor.

That bus is terrifyingly bad IMO, I remember developers lamenting it in interviews I've read over the years.

One of the -smartest- things Sony did with the PlayStation was picking an architecture that was known to be capable of 'pushing graphics' and licensing/reusing what they could from SGI's system designs. That's probably a big part of why, while still challenging for its age, it was at least -sane- compared to what was involved on the Saturn (lol, quads) or the Jaguar.

Nintendo, while late to the 32-bit party, also went with an SGI-based architecture, but stuck with ROM carts, which still cost them a bit of business.

Funny, the TG-16 suffered from many of the same problems as the Jaguar (specifically, questionable CPU bus sizes and quirky hardware design). Alas, it seems nobody from Atari paid attention to that lesson.


Yes, you're right, technically it's 2x bus speed moving 16->32 bits, but I was guesstimating the effect of the cache allowing the 020 to stay off the bus, i.e. interleave cache and bus access.

What would have made more sense would be for the object processor to be dropped, so Tom and Jerry could be placed on the same chip with some scratchpad RAM to share amongst the processors. You'd have a big cost saving on a single-chip system, and you could also move to 1.5 MB dual-bank RAM, probably at price parity.

The 020 probably would've been too expensive, but you needed to go large or go home in the console world. I'd have gone with a CD drive to drop the ROM costs, so that, projected over a year, the upfront hardware cost would hopefully amortise against the CD savings. A single chip might have allowed an 020+CD at a $300 launch price rather than $250.

The Saturn architecture is a completely bizarre kettle of fish even vs the Jaguar! I can see where the pinch points were for the Jag, but the Saturn looks as though they redesigned the machine twice, without any actual input from the AM2 team (Virtua Fighter). Certainly from a systems perspective there's a lot of scope for making it more efficient.

I kind of thought the TG-16 was straightforward from a programmer perspective? The custom chips had a few things they could do and that's it. The demos I've seen (pouet) basically seem to use the hardware as it was intended, i.e. no fancy tricks that you can coax out of it. I'd say the SNES was more crippled by bottlenecks, probably stemming from the initially planned backwards NES compatibility that never materialised?


That sounds like the next Zachtronics game.


Honestly I would love to see the Jaguar become the target of game jams and high-profile ports and stuff. That would warm me heart.


One of the things I would love to see would be a voxel-based fighting game. The rage at the time was for texture-mapped polygon fighting games, but the Jaguar was better at voxels (see Phase Zero: https://www.youtube.com/watch?v=2OPcFdkj0GU).

TBH I reckon the Atari ST and Amiga could've handled a voxel fighter at a decent clip: https://www.youtube.com/watch?v=CmnZoUxXIQM at about 2:46


It has an extremely active indie development community, still. They’re mostly huddled at AtariAge.


I would love to hear more about the experience of programming with the Jaguar. Do you have any interviews or sources for knowledge on this?


I find the Amiga graphics the nicest, then the Jaguar's; the 3DO's are so terrible and overwrought that I never would have liked this game if that had been the first version I played.

Nostalgia? Maybe. But I find the minimalist polygon graphics of the original a big part of what made Another World a big hit. At least to me. In my mind this is related to the concept of "closure" as described in "Understanding Comics": the more abstract, more symbolic the graphics, the more my mental processor can give it hyperrealistic meaning in my mind; conversely, the more detailed the graphics, the less interesting they become and the less room for my mind to "complete" them. The beast as a black outline with eyes and fangs is terrific (and terrifying). The beast as a more detailed rendition would probably just look silly.

Plus, you know, the polygon graphics look objectively cool ;)


I was an Amiga user, and I certainly can see why you prefer it, but the Jaguar ones look great as well (something I'd never have admitted to back in my Amiga days when Atari was "the enemy") - it feels like pixel art made for the resolution.

The problem with the 3DO one, to me, is that it looks like it was just downsampled from a high resolution image rather than drawn for the resolution it was rendered at.


> back in my Amiga days when Atari was "the enemy"

To the best of my recollection that was the first major fanboi war in computers, amplified by the advent of pre-internet online communities in the form of BBSs (Bulletin Board Systems). I fondly remember accessing GEnie[0] from my trusty Atari 1040ST (my first home computer, chosen since it had MIDI built in).

However, I have to admit the whole fanboi thing struck me as silly, since Amigas were graphically superior, while the Atari ended up attracting MIDI developers and users. Not until much later did I realize that, in marketing terms, a fanboi war may very well benefit both sides. So while I still shake my head at fanboi posts, I also see why manufacturers like those wars.

[0] https://en.wikipedia.org/wiki/GEnie


To the best of my recollection that was the first major fanboi war in computers, ...

Definitely not the first major one! I fondly remember endless back and forth discussions between proponents of the magnificent Commodore 64 and adherents of the despicable ZX Spectrum.


So close but deeply wrong. The Spectrum was the path of goodness and light and the C64 was the route to beige coloured damnation.


I disliked the Speccy back then: I owned a C64 and my friend a Sinclair (which was a Spectrum clone, if I remember correctly), and I thought the C64 was so much better: better sound, better colors, fewer "artifacts" in color graphics...

... now I see the Spectrum and I have a newfound respect for it. It was technically inferior but there was a certain grace in how game devs worked around its limitations and weird color palette. And the graphics were crisper than I remembered.


Sinclair Research (named after Sir Clive Sinclair, the founder) was the developer and original owner of the Spectrum. The classic Spectrum models are either from Sinclair or a version developed by Timex (who also did a lot of the Spectrum manufacturing, I think) under license.

There were a number of clones too, but Sinclair is the real deal and Timex Sinclair was officially licensed.

Later the Spectrum was sold to Amstrad, at which point you got the Spectrum models that look like Amstrad computers.


It's strange for me now, looking back and appreciating the Commodore 64 for what it was: the colours that I thought muddy and awful as a child now seem considered and more broadly useful, and the capabilities of the SID chip blow away what the Spectrum was capable of.

And then there was the Amstrad - I simply can't recall its graphics being as good and as colourful as they clearly were. Blinkered fanchild shades, I guess!


Trust your instincts as a child: the C64 palette was, and remains, horrible.


The Spectrum was surely just meant as a doorstop, just like the Oric 1


I consider this article very much in the spirit of those years: http://www.alfonsomartone.itb.it/fztsmo.html


Haha, that was funny.

Though obviously misguided. The C64 was superior. Commodorian 4ever here!


I was always surprised at how the Amiga went under the radar during the Apple IIe and Microsoft DOS days.


Commodore destroyed their dealer network in the US by mistreating dealers during the C64 days. The worst was probably Jack Tramiel's parting shot as Commodore CEO, when he cut recommended retail prices overnight without warning dealers or offering any refunds/discounts to deal with the massive drop in the value of their inventory he caused.

Conversely in Europe where local Commodore subsidiaries did well in many countries, Apple was a tiny presence most places until the Mac.

I pretty much lived in the local computer stores in Norway in the 80s and never saw any Apple II variant, for example. Apple was a non-entity to me, even though I knew of obscure brands like Dragon and Enterprise 64.

Commodore dominated the home computer market in Norway, with Atari, Spectrum and Amstrad following shortly after, and a variety of smaller players.

Once the Mac arrived, one store nearby had a single one on display in the corner.

These things varied greatly by country.


To add to vidarh's testimony: in Portugal, Apple devices were sold by Interlog, which had two offices, one in each of the two main cities, Porto and Lisbon.

So no one on the islands would get to see one, and on the mainland it required actually driving down to those shops to see them; or you would eventually see an ad in a computer magazine and schedule a delivery to your place, which could take several weeks.

On top of that, they were very pricey versus the competition, which was itself already quite expensive for the average Portuguese salary.

So the only place you would actually see Apple live would be in some computer lab at the top universities.

Regarding DTP, most companies were using Ataris or Amigas for it.


I think it's the same as the effect of a really good book versus a really good TV/film adaptation.

I can count on one hand the adaptations that lived up to my imagination; the most recent have been The Expanse and Altered Carbon, both of which absolutely nailed the source material.


I'm torn, but I don't totally agree. I had never seen the 3DO version before now, and I really like how it evolves the original layout. I think the artwork is really interesting and "other-worldly" while still retaining some of the abstractness. Part of this might be due to being familiar with the original and having an exciting alternate version to see; I'm not sure the game would have been as notable without the polygon aesthetic.


I know exactly what you mean! This is why, to me, the text adventure genre was always so fantastic.


The link skips over the part explaining that this is part of a series about how Another World was implemented on many platforms: http://fabiensanglard.net/

Also, in this video https://www.youtube.com/watch?v=tiq0OL8rzso&t=2m the person who implemented the SNES version tells the story first-hand.


Funny how Sony did a similar thing (and had similar problems) with the Cell architecture of the PS3.

The developers who took the time to really target the platform (Naughty Dog? Kojima Productions?) managed to extract a lot of performance from it, and PS3 really felt like something from the future for the time, but it made developing and porting to the platform much harder.

(The difference is that the Cell was more like a cluster architecture, focused on multiprocessing, while the Jaguar seems to have had unique components, each with its own quirks.)


It's more analogous to the PS2 and its pipeline of arcane custom chips. All of them had their own flavor of assembly, including a few chips that handled 2 opcodes per instruction, each operation with different cycle latencies. These chips had to be driven by the DMA and synchronized with double buffers to ensure you had a decent fill rate of polygons. The CPU had a 'scratchpad' which you could access in 1 cycle (as opposed to 30-200 from main memory, depending on cache misses), and which would also be fed by the DMA. Also, one of its vector processors was accessible (slowly) directly from the main CPU as a coprocessor (the so-called macro mode), but could also be driven by the DMA (micro mode). The main processor and the input/output processor were two flavors of MIPS, the latter being the same as the PS1's, presumably to provide emulation.
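
A sketch of the double-buffered DMA pattern being described, in C (illustrative only; dma_kick/dma_wait are hypothetical primitives standing in for the real DMA interface):

    #include <stdint.h>

    #define BUF_WORDS 1024

    /* Hypothetical DMA primitives: start an async copy, wait for it. */
    extern void dma_kick(uint32_t *dst, const uint32_t *src, int words);
    extern void dma_wait(void);

    /* Two halves of a scratchpad: the DMA fills one while the CPU
       works on the other, then they swap. */
    static uint32_t scratch[2][BUF_WORDS];

    void process_stream(const uint32_t *stream, int chunks,
                        void (*work)(uint32_t *, int))
    {
        int cur = 0;
        dma_kick(scratch[cur], stream, BUF_WORDS);      /* prime */
        for (int i = 0; i < chunks; i++) {
            dma_wait();                     /* current buffer ready */
            if (i + 1 < chunks)             /* prefetch next chunk  */
                dma_kick(scratch[cur ^ 1],
                         stream + (i + 1) * BUF_WORDS, BUF_WORDS);
            work(scratch[cur], BUF_WORDS);  /* overlaps the DMA     */
            cur ^= 1;
        }
    }

The compute on one buffer overlaps the transfer into the other, which is the whole point of the scratchpad.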

I'm certain the PS2 architecture was designed by a mad genius, and it took one to use the system to the max. If it weren't for the popularity of the PS1, it might have gone the way of the 3DO. The PS3 was a dream to work with in comparison.


On the Jaguar, I was employed to write a custom engine for a 3d fighting and exploring game with some unusual art: https://en.wikipedia.org/wiki/Legions_of_the_Undead

My approach to squeezing the most out of the Jaguar was to start by getting a fairly complex fused tiled-polygon-and-texture rendering algorithm working on the 68000 first, written mostly in C++ for GCC, with some of the classes generated in Lisp.

To get all the geometry and other details right, focused on feeding calls to a texture-scan-line inner loop.

Then to switch the inner part to a fast, asynchronous command pipe, pushing commands to the GPU/blitter et al., which would read and execute those as fast as possible. And then, as needed, optimise any bottlenecks on the 68000 using assembly.

The idea was to keep the GPU/blitter as busy as possible filling pixels from simple texture-line commands, with minimal logic to fetch and setup each new blit (i.e. no higher level geometry calculations during which the texturing would be idle), while the 68000 ran in parallel generating those commands and doing the higher level geometry, which was quite complex in our game.
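
For the curious, here is a minimal sketch of that kind of command pipe in C: a single-producer, single-consumer ring buffer of span commands. All names and fields are my own illustration, not the original engine's.

    #include <stdint.h>

    /* One "draw this textured scan line" command. */
    typedef struct {
        uint32_t src;     /* texture row address           */
        uint32_t dst;     /* framebuffer scan-line address */
        uint16_t count;   /* pixels in the span            */
        uint16_t step;    /* fixed-point texel step        */
    } SpanCmd;

    #define RING 256      /* power of two: wrap is a cheap mask */

    typedef struct {
        volatile uint32_t head;   /* advanced by the 68000 (producer) */
        volatile uint32_t tail;   /* advanced by the GPU (consumer)   */
        SpanCmd cmd[RING];
    } CmdPipe;

    /* Producer side: stalls only when the consumer is RING commands
       behind, so the blitter stays saturated with pixel work while
       the CPU gets on with the geometry. */
    static void push_span(CmdPipe *p, SpanCmd c)
    {
        while (p->head - p->tail >= RING)
            ;                         /* pipe full: let the GPU drain */
        p->cmd[p->head & (RING - 1)] = c;
        p->head++;                    /* publish the command */
    }

The consumer pops commands the same way and programs one blit per command; the expensive geometry never sits between two blits.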

I got everything running just right on the 68000 for a nice demo, with the graphics and maps we had ready by then. It was visually perfect, and play speed was just about usable but not smooth like we were aiming for.

Unfortunately perhaps, that's when Atari called my boss in to do a milestone demo, so that version was shown.

When my boss returned, I was told whichever Tramiel was head of Atari at the time was "angry" as the game looked "too slow", and the project was cancelled. :-(

Just as I was getting the GPU/blitter accelerated mode working, which was projected to be 2 weeks work (since everything up to that point had been aimed at this).

I never did get to complete or show off the fast version. So close!

Atari was in trouble and collapsed very soon after, so maybe that wasn't the real reason.

We then started moving the project to the Fujitsu FM Towns: https://en.wikipedia.org/wiki/FM_Towns

And then the PC using DJGPP.

The geometry, texturing and physics systems could be moved over because they were mostly GCC-compatible C++. (At the time different C++ compilers accepted different dialects.)

But all the effort around optimising the engine around pushing pixels as fast as possible out of the Jaguar's hardware subsystems had to be abandoned.

Fortunately the pipeline architecture translated quite well to x86 texturing, first using awesome integer register tricks (very few registers on x86, but because you could address individual bytes and words inside dword registers you could "vectorise" some of the texturing and address arithmetic); later the FPU became faster.
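
As an illustration of that kind of packed-register trick (a simplified example of mine, not the engine's actual inner loop): two fixed-point texture coordinates can share one 32-bit value, so a single add steps both. On x86 the integer parts could then be pulled back out through the byte sub-registers (AL/AH and friends).

    #include <stddef.h>
    #include <stdint.h>

    static uint8_t texture[256 * 256];  /* 256x256 texture, row-major */

    /* uv packs v in the top 16 bits and u in the bottom 16 (8.8 fixed
       point each); duv packs the per-pixel steps the same way.
       Simplified: a real loop also had to keep the carry out of u
       from corrupting v. */
    static void texture_span(uint8_t *dst, size_t n,
                             uint32_t uv, uint32_t duv)
    {
        while (n--) {
            unsigned u = (uv >> 8)  & 0xFF;   /* integer part of u */
            unsigned v = (uv >> 24) & 0xFF;   /* integer part of v */
            *dst++ = texture[v * 256 + u];
            uv += duv;                  /* one add advances u and v */
        }
    }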

This was in the days of Doom, Quake and Descent, when extremely hand-optimised assembly software rendering was normal on PCs, and consoles had very strange "not quite right for 3D" blitters. (Ask about the Sega Saturn sometime.) GPUs as we now know them were just starting to be created for PCs, and Microsoft launched DirectX a year or so later.

--

The Jaguar's CRY colour space was not great. We had a very large graphics asset pipeline at the time, derived from hundreds or thousands of photos of real physical models made by the artists (a very different look than drawing), and all of it had to be scaled, alpha-clipped and quantized into CRY, which was not kind to the colours. Adequate, but not great use of the colour bits available.

Somewhere I'm pretty sure I still have a "pnmtocry" executable in Linux a.out format :-)


I just wanted to say that I enjoyed this comment so much. As a Jag owner at the time I remember being taken by the production shots of that game that popped up in magazines, and couldn't wait to see it in motion. The craftiness necessary to pull off a compelling rendering pipeline on the hardware of that time period, combined with the aesthetic directions the constraints pushed designs in, still inspires me. (And somewhat indirectly led me down my own career path.) Thanks!


Hey, thanks for the uplifting comment!

If we had published on the Jaguar, maybe that would have led to a whole different career for me too :-)

It's lovely to think someone outside the company enjoyed it. Because it was never published, it felt like multiple years put into something that just disappeared, and I didn't get much feedback from outside the company. That happened with the next game I worked on too, Mr Tank, which was more technically advanced and a lot of fun to play in the demos.


I'll bite... what about the Sega Saturn's blitter?


Bearing in mind I didn't work on the Saturn, this is what I recall from others. For 3D rendering, the Sega Saturn:

- Could only draw quadrilaterals with any corner coordinates, but not triangles. You need triangles to draw general 3D polygon shapes. Triangles could be simulated by setting two corners to the same position, but that doubled the amount of rendering due to the next points:

- Drew quads in a strange way that overwrote lots of pixels repeatedly if they weren't screen-aligned rectangles. Instead of drawing horizontal spans on the screen to the bounds of the shape, as almost anyone else's renderer would do, so that each pixel was rendered once, it drew lots of sloping lines, starting with a line between points A and B at the "top" of the quad, working its way towards points C and D at the "bottom" of the quad, interpolating the endpoints and slopes appropriately.

- Drawing the sloping lines would have been ok (though not as efficient for RAM access) if it had managed to still draw each pixel once by careful choice of line pixels. But instead, it just drew sloping lines on top of each other, with the effect that you got a combination of some pixels being drawn over more than once inside the quad (wasted time), and also gaps where no pixels were drawn inside the quad (holes!).

- To avoid the holes you had to ensure the line density was enough that you would definitely also have significant pixel duplication. The line density depended on the polygon's longest edge and its orientation after projecting from 3D to screen space.

- When drawing a highly distorted quad that meant a lot of pixel duplication at the more "pinched" end.

- Calculated texture coordinates by linear interpolation along these lines, instead of 3D projection. This made the texture look strongly distorted on quads not facing the viewer in 3D space, needing a workaround in software.

- And then there was the difficulty figuring out how to get the hardware to actually do these things from the poor documentation and an enormous list of some 200 registers, without useful sample code or libraries.

Drawing a good looking 3D textured polygon was tricky on this thing. Even when you had it drawing polys they looked wrong in all sorts of ways until compensated for, and still looked a bit wrong no matter what you did.

And then you just knew it was much slower than it could have been given the rest of the hardware, bus speeds etc. It seems like the hardware designers had put a lot of effort into a complex chip, but unfortunately didn't understand much about 3D graphics.
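
To make the overdraw-and-holes behaviour concrete, here is a rough C sketch of the line-sweep fill described above (my paraphrase, not actual Saturn code; put_pixel is a hypothetical plotting primitive):

    #include <stdlib.h>

    typedef struct { int x, y; } Pt;

    extern void put_pixel(int x, int y);

    /* Each sweep line is rasterised independently, with no knowledge
       of its neighbours: that independence is what creates both the
       duplicated pixels and the holes. */
    static void draw_line(Pt a, Pt b)
    {
        int dx = b.x - a.x, dy = b.y - a.y;
        int n = abs(dx) > abs(dy) ? abs(dx) : abs(dy);
        for (int i = 0; i <= n; i++)
            put_pixel(a.x + (n ? dx * i / n : 0),
                      a.y + (n ? dy * i / n : 0));
    }

    /* Sweep lines from edge AB at the "top" toward edge CD at the
       "bottom", interpolating the endpoints along A->C and B->D. If
       steps is smaller than the longest edge you get holes; making it
       large enough to avoid holes guarantees some pixels are drawn
       more than once. */
    static void quad_fill(Pt A, Pt B, Pt C, Pt D, int steps)
    {
        if (steps < 1) steps = 1;
        for (int i = 0; i <= steps; i++) {
            Pt s = { A.x + (C.x - A.x) * i / steps,
                     A.y + (C.y - A.y) * i / steps };  /* along A->C */
            Pt e = { B.x + (D.x - B.x) * i / steps,
                     B.y + (D.y - B.y) * i / steps };  /* along B->D */
            draw_line(s, e);
        }
    }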


Thanks for sharing, very interesting story.


The color space is probably one of the stranger things I've seen. I wonder what led the designers to choose that scheme. I'd love to see the design considerations for this whole system.


It's not conceptually that different from YUV[1], the color space used by PAL television and most JPEG images. I can't speak to the Jaguar, but both of those decisions were motivated by the desire to completely separate the luminance/brightness dimension from the 2 color dimensions. For PAL this was to maintain compatibility with black-and-white televisions during the slow adoption of color; for JPEG it's a compression trick, since the chroma channels can be downscaled more than the grayscale without humans noticing as readily.

In any case, I'd argue reasoning about this space is a bit more natural than RGB (and a bit less than HSV), too.

But I'm sure there were some other specific requirements they were trying to satisfy...

[1] - https://en.wikipedia.org/wiki/YUV
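
A minimal C sketch of that trick (illustrative only; the luma weights are the standard BT.601 ones):

    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } RGB;

    static uint8_t clamp_u8(float x)
    {
        return (uint8_t)(x < 0 ? 0 : x > 255 ? 255 : x);
    }

    /* Luminance carries most of the perceived detail, so it is kept
       at full resolution. */
    static uint8_t luma(RGB p)
    {
        return clamp_u8(0.299f * p.r + 0.587f * p.g + 0.114f * p.b);
    }

    /* One chroma sample (here Cb) for a whole 2x2 block of pixels:
       averaging it away is barely visible, and that resolution
       asymmetry is the compression win. */
    static uint8_t cb_2x2(RGB a, RGB b, RGB c, RGB d)
    {
        float avg_b = (a.b + b.b + c.b + d.b) / 4.0f;
        float avg_y = (luma(a) + luma(b) + luma(c) + luma(d)) / 4.0f;
        return clamp_u8(128.0f + 0.564f * (avg_b - avg_y));
    }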


YCbCr (often called YUV for... reasons) and 4:2:0 chroma subsampling (half the horizontal and vertical resolution for each plane of color information compared to the brightness) are still the dominant color encoding used for digital video, HDTV, Blu-ray, etc.

The "CRY" model used by the Jaguar is fairly different than (though as you say, related conceptually in its separation of brightness/darkness information, of course). I think it basically just arose from a combination of big focus on smooth shading while keeping the color values workably small and fast to calculate on.


It is fundamentally different from YUV (PAL)/YIQ (NTSC). In those systems, the UV/IQ regions are monotonic. In the Jaguar color space, the axes have different meanings depending on which quadrant you're in. In the (-1,-1),(0,0),(1,-1) triangle, the vertical dimension appears to have no effect.

Nintendo's native use of the YIQ color space for the NES made sense because it simplified the hardware: YIQ is the color space used by NTSC. It's not apparent to me that there's an easy way to convert this non-monotonic color space into YUV/YIQ.


Thanks for the explanation and the link! I'd imagine it would make shading computationally quicker, now that you mention separating color from brightness.


The idea was to use a lot of gouraud shaded polygons (i.e. polygons with a gradient that goes to black) and the color space was designed, with the blitter chip, to make those kinds of polygons fast.
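
A sketch of why that pairing is fast, assuming the commonly described 16-bit CRY layout with the intensity (Y) in the low byte; treat the exact bit layout as an assumption on my part:

    #include <stdint.h>

    /* Shading a CRY pixel toward black only touches the intensity
       byte; the chroma byte is untouched, so a Gouraud gradient is
       one interpolated byte per pixel. */
    static inline uint16_t cry_shade(uint16_t pixel, uint8_t y)
    {
        return (uint16_t)((pixel & 0xFF00u) | y);
    }

    /* For contrast: shading an RGB565 pixel needs three multiplies
       and a repack on every step of the gradient. */
    static inline uint16_t rgb565_shade(uint16_t p, uint8_t s)
    {
        unsigned r = (p >> 11) & 0x1F, g = (p >> 5) & 0x3F, b = p & 0x1F;
        return (uint16_t)(((r * s / 255) << 11) |
                          ((g * s / 255) << 5)  |
                           (b * s / 255));
    }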


The URL links to the #back_1 anchor/id and ends up half a page down.


Are there any particularly good interviews with Jaguar developers? I love getting into all the tiny nerdy details of old game systems.


Maybe someone could interview drcode from the top comment!



> Three years later, with the Jaguar project ahead of schedule, Atari decided to abandon the Panther and released its 64-bit machine in November 1993

So "ahead of schedule" that they released broken hardware (for example, the SRAM scratchpad was supposed to be a cache, but that didn't work) with a broken SDK on top (the generated code had a tendency to crash randomly).


If you want to see how the gameplay looks, check it out: https://www.youtube.com/watch?v=wjMf_bEfqIc

This is the PC port so I'm not sure if it looks exactly the same as the original...


That’s the 20th Anniversary edition; it has been graphically enhanced from the original. In that linked video, you can see at 3:22 where the player, probably by mistake, switches back and forth between the original graphics and the enhanced version, but just for a second.


The Amiga graphics age the best.

The 3DO's are nasty looking.


I love this series, it's just so comfy (and informative) to read.


Thanks!


I agree with the parent post.

One really interesting game to analyse technically would be BioForge, a golden-era MS-DOS game that used some never-before-seen tech.



