Classic NES Series Anti-Emulation Measures (mgba.io)
209 points by kibwen on Feb 1, 2017 | 75 comments



From the "Dolphin Progress Report" article (https://news.ycombinator.com/item?id=13544514): there were two Disney games that wrote garbage data to a region of memory where important data was stored, but not enough to cause the CPU to flush the cache, and then invalidated the cache, causing the writes to never actually happen on real hardware, but on Dolphin (which didn't actually implement a CPU cache in its emulation layer), it caused the writes to actually happen.


That is awesome. I'll have to remember that one.


That's super clever


All this effort to claim possession over something. Imagine if all the effort regarding possession and wealth were used to make something actually useful. What a waste.


A ridiculous portion of GDP is wasted on this type of hostility. Just look at how much marketing is completely unnecessary, but billions are spent on it. Look at how many companies are spending money to reinvent the wheel constantly instead of sharing code, designs, etc. Patent lawsuits. We're just scratching the surface here. It's a depressingly wasteful and hostile world.


Aye, it's tough to see it as anything but a waste too - it gets bypassed and that's it; all the time, effort and money that went into it was effectively for nought.


I actually believe the exact opposite: that so many useful innovations are lost & forgotten because there was no incentive to market & distribute them.

It is lost on many that the usefulness of an idea does not guarantee its popularity. It is very, very hard to get visibility & traction for even the most useful of inventions.

Protections on ownership & wealth create the incentives not only to invest in creating new innovations, but also to productise, distribute and market them.

Only a small fraction of products can exist without the proper incentives to build out the whole chain from idea to R&D and productisation, through to distribution & marketing.


I believe it's somewhere in between. The usefulness of an idea doesn't guarantee its popularity, but it also doesn't guarantee its profitability, or the timeline on which it will become popular or profitable.


The fashion industry seems to do fine with relatively little draconian protection - https://www.youtube.com/watch?v=zL2FOrx41N0


But they're not doing anything useful, just taking advantage of human nature.


Run the Jewels just released their latest album entirely for free, and hit #1 in sales anyway, so


That a counterexample exists doesn't really impact the claim that "many" (though they imply "most" imo) do not work this way.


Indeed: Not everybody is the kind of person who pays for Jonathan Coulton's music instead of downloading it for free.


RTJ also have an established brand - I can tell you 100% I only know of them because of their previous marketing; I've maybe heard one song. It's like how Radiohead did a pay-what-you-like release for their album and it came out viable: it only worked because of their previous commercial success through more traditional routes. If they'd self-produced and made Pablo Honey PWYL they'd probably be unheard of.


Perfect summary of what's wrong. I often wonder what would happen if the best minds were trying to solve the world's problems instead of getting people to buy things they don't want.


It is difficult to take a binary program and establish intent. I'd rather call these "interesting and unusual programming tricks" than "anti-emulation measures" until more evidence is presented.

Mirrored memory is a side effect of unconnected lines on the address bus thus making the content of those bits irrelevant. Code can take advantage of this to run faster or put tag values into addresses.

On a GBA, VRAM is faster than ordinary RAM. Programs can do well to use it for tight inner loops.

Using STM (store multiple) to DMA registers? Again, go faster.

Save type masquerading might be code that helps when running on a development kit, but I admit that I can't think of what use it might have.

Self-modifying code that depends on the pre-fetch queue might be the best place to look for intent. It might be easy to tell whether the program is doing it for some larger purpose, or simply to fail subtly or overtly when it sees unrealistic processor behavior.

And why would a program do more work writing to an audio FIFO than it needs to?


> Mirrored memory is a side effect of unconnected lines on the address bus thus making the content of those bits irrelevant. Code can take advantage of this to run faster or put tag values into addresses.

What does this have to do with speed?

> On a GBA, VRAM is faster than ordinary RAM. Programs can do well to use it for tight inner loops.

Almost all programs use IWRAM for this. It's one of the things it's for. It's as fast as, if not faster than, VRAM (depending on whether the PPU is accessing VRAM at the time).

> Using STM (store multiple) to DMA registers? Again, go faster.

In retrospect, yeah this might actually be the case.
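
For what it's worth, here's roughly what that looks like in C (a sketch; register addresses per GBATEK, and GBA toolchains have commonly lowered this kind of aggregate store to a single stm):

    #include <stdint.h>

    typedef struct {
        const void *sad;  /* 0x040000D4: DMA3 source address      */
        void *dad;        /* 0x040000D8: DMA3 destination address */
        uint32_t cnt;     /* 0x040000DC: word count + control     */
    } DmaChannel;

    #define REG_DMA3   (*(volatile DmaChannel *)0x040000D4)
    #define DMA_ENABLE (1u << 31)
    #define DMA_32BIT  (1u << 26)

    static void dma3_copy32(void *dst, const void *src, uint32_t words) {
        /* One aggregate store: source, destination and control written
         * back-to-back, which the compiler can emit as a single stm. */
        REG_DMA3 = (DmaChannel){ src, dst, DMA_ENABLE | DMA_32BIT | words };
    }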

> Save type masquerading might be code that helps when running on a development kit, but I admit that I can't think of what use it might have.

There are a handful of GBA games that lie about their save type as anti-piracy and refuse to save or even boot if they find the wrong one. It's also really the only anti-piracy technique that any other GBA games actually use, since it's quite effective against flash carts.
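
The mechanism, sketched (not code from any actual game): flash carts and emulators detect save hardware by scanning the ROM for the save-library version strings Nintendo's SDK embeds, so a game can plant a decoy.

    /* Decoy: claims EEPROM, so a flash cart configures its save
     * hardware wrongly. String format per GBATEK; the version
     * number here is made up. */
    static const char decoy_save_id[] = "EEPROM_V124";

    /* The real save code talks to SRAM directly (GBA SRAM region,
     * 8-bit accesses only) and can tell when it isn't there: */
    #define SRAM ((volatile unsigned char *)0x0E000000)

    static int sram_present(void) {
        unsigned char old = SRAM[0];
        SRAM[0] = 0x55;
        int ok = (SRAM[0] == 0x55);
        SRAM[0] = old;
        return ok;  /* if 0: refuse to save, or refuse to boot */
    }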


> What does this have to do with speed?

You generally need to mask potentially-tagged pointers before dereferencing them. (Ab)using virtual memory or unconnected lines lets you skip the mask, eliminating the cost in any code that's otherwise unconcerned about the tags. This in turn may let you save a byte or register here and there (and thus save memory bandwidth / potentially spill fewer registers, maybe saving some performance).
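
A sketch of the mask-skipping point, with made-up constants, assuming a machine that only decodes the low 24 address bits (as on the 68000-era Macs mentioned below):

    #include <stdint.h>

    #define TAG_SHIFT 24  /* top 8 bits are ignored by the bus */

    static inline void *tag_ptr(void *p, uint8_t tag) {
        return (void *)((uintptr_t)p | ((uintptr_t)tag << TAG_SHIFT));
    }

    static inline uint8_t ptr_tag(void *p) {
        return (uint8_t)((uintptr_t)p >> TAG_SHIFT);
    }

    /* On such hardware a tagged pointer dereferences directly --
     *     int x = *(int *)tag_ptr(q, 3);  // no mask needed
     * On hardware that decodes all 32 bits you'd pay an AND (and
     * maybe a spare register) at every single use. */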

GCs may abuse pointer tagging to keep track of what they've scanned. Ruby's VALUE type is pointer-sized, and will point to Ruby objects such as strings and symbols - but it can also directly represent e.g. a 31-bit integer value ('Fixnum') on 32-bit systems without needing to be dereferenced, and without needing to consume a separate type field.
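
Roughly how that works under the hood (simplified from Ruby's headers; heap pointers are at least 2-byte aligned, so bit 0 of a real object reference is always 0, freeing it to mean "this VALUE is itself an integer"):

    typedef unsigned long VALUE;

    #define FIXNUM_FLAG 0x01
    #define FIXNUM_P(v) ((v) & FIXNUM_FLAG)   /* integer, or object pointer? */
    #define INT2FIX(i)  ((VALUE)(((long)(i) << 1) | FIXNUM_FLAG))
    #define FIX2LONG(v) ((long)(v) >> 1)      /* arithmetic shift keeps sign */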


Also used in early revisions of the Amiga (BCPL or Amiga Basic used pointer tags for something like garbage collection or frame pointers, IIRC). The Mac had something called "32-bit clean", which I imagined could have been pointer tagging.


"32 bit clean" meant that the ROM didn't use pointer tagging; otherwise it was using the top 8 bits for its own purposes and wouldn't run on machines which used the full 32-bit address line.

http://lowendmac.com/2015/32-bit-addressing-on-older-macs/


> Self-modifying code that depends on the pre-fetch queue might be the best place to look for intent. It might be easy to tell whether the program is doing it for some larger purpose, or simply to fail subtly or overtly when it sees unrealistic processor behavior.

Indeed, in the PC world taking advantage of the prefetch queue as a sort of "loop buffer" was relatively common in demos, where a loop would patch instructions to execute in the next iteration, squeezing out a few extra cycles. Here's a detailed application of this trick:

http://www.reenigne.org/blog/8088-pc-speaker-mod-player-how-...

...as seen in this awesome demo:

https://trixter.oldskool.org/2015/04/07/8088-mph-we-break-al...


That demo is insane. Is there some kind of archive that is keeping track of them?

It would be sad if they were to be lost in time.


pouet.net and scene.org

In particular, http://www.pouet.net/prod.php?which=65371


Yeah, putting tight loops into fast RAM for extra speed is a very old trick --- I've done it myself. Likewise multiple stores.

Using a non-standard copy of the address --- well, it's an emulator, on a slow system; the ARM requires 32-bit constants to be read from a constant pool. If it can use certain addresses that are cheaper to construct, somehow, that'd be a performance boost. Can't tell without knowing which addresses, though.

My first thought on the save type masquerading and the pre-fetch queue testing is that it's testing for particular hardware. e.g. if it's running on a cart with SRAM, do the SRAM thing, otherwise do the flash thing. Likewise, testing the pipeline size might be trying to figure out what processor there is. That doesn't explain why it just crashes rather than following some other code path --- if the code to do the SRAM thing was there, and the emulator tells the game that there's SRAM, then the emulator should see the game doing the SRAM thing.

It might be something as trivially stupid as that the game contains the code to check for development hardware, but that the run-time support for the development hardware isn't present and instead the game is just crashing. There may not be anything malicious here.

If I wanted some sort of antipiracy or antiemulation feature, I wouldn't put a big obvious crash up front. Instead I'd introduce some sort of random failure elsewhere in the game, so it superficially looks like it's working, but isn't any fun to play...


> Instead I'd introduce some sort of random failure elsewhere in the game, so it superficially looks like it's working, but isn't any fun to play...

This isn't mentioned in the article because it's one of the anti-piracy/anti-emulation techniques I hadn't discovered at the time of writing (due to my dump being an overdump): many of the games do exactly this. They screw up input so the game either boots but can't be played at all, or input is unplayably slow. It's detected via an interesting memory mirroring quirk that these cartridges have and no other GBA carts do.
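
Illustratively, a check along these lines (a sketch with a made-up ROM size - the real detection is described in the article): Classic NES Series carts mirror their small ROM across the cartridge address space, while ordinary carts and flash carts return different data past the end of the ROM.

    #include <stdint.h>

    #define ROM_BASE 0x08000000u
    #define ROM_SIZE 0x00100000u  /* hypothetical 1 MiB image */

    static int looks_like_real_cart(void) {
        volatile const uint32_t *rom    = (void *)ROM_BASE;
        volatile const uint32_t *mirror = (void *)(ROM_BASE + ROM_SIZE);
        for (int i = 0; i < 16; i++)
            if (rom[i] != mirror[i])  /* mirroring cart: identical */
                return 0;
        return 1;
    }

    /* On failure, don't crash -- quietly mangle the input handler so
     * the game boots but is unplayable. */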


This is one effect I discovered when I was working on getting around the anti-piracy measures of an arcade game[1][2] in order to run it on similar hardware. I think it's a cleverer copy-protection mechanism than stopping the game from booting.

[1] http://mikejmoffitt.com/articles/0047-puyopuy2.html [2] https://tcrf.net/Puyo_Puyo_Tsuu_(Arcade)


> Instead I'd introduce some sort of random failure elsewhere in the game, so it superficially looks like it's working, but isn't any fun to play...

Doesn't that defeat the purpose? If people don't realize that they're being punished for pirating, you're just collecting bad review scores and not pushing anyone to buy a legitimate copy.


It's actually a not-unheard-of trick. Arkham Asylum, for example, would have a failure case about halfway through the game where Batman's cape would fail to open when he jumped down into a deep (plot-required) pit if the game had been pirated. Batman would crash into the ground and die, the users would take to developer forums or the Steam forums to complain that they couldn't get past this one section because of a game-breaking bug, and then the developers would say, "Yes, that's an anti-piracy measure. If you purchase the game, it won't happen".

Another game, Game Dev Tycoon, would run as normal, but as you got further and further along, in-game pirates would pirate all the games you made and your profits would keep on dropping. People came to the developer forums to ask for ways to keep people from pirating their games, because they couldn't make any money because of all of the pirates. The irony was lost on some.


I think the Game Dev Tycoon one is the only actual success story, and that's because the story went viral.

For every person who goes out of their way to complain on the forums, there are probably five who just caution their friends not to buy the game.

I also recall (obviously difficult to verify) accounts from people who claimed that the Arkham Asylum (I thought it was City...?) bug happened to them with legitimate copies. From a development perspective, an "Easter egg" of that sort requires a LOT of QA effort.


This is exactly what games like Spyro the Dragon do - the cracker would give up after fixing n protections.


Yes, it makes it more time-consuming to fix all the protections. It's like all debugging: the crash bugs are usually easier to fix; it's the subtle ones a long way in that are harder.


I'd take a guess that the save tricks might be more anti-piracy. After all, the GBA had flash carts; perhaps those carts contained SRAM and EEPROM, so checking for them was a way to block them. No idea about the pipeline stuff, though - perhaps checking that VRAM was working correctly?


Particularly given the "NES Classics" line, I would expect weird stuff to happen because these cartridges are effectively themselves emulations of games from different hardware.


> I’m not really sure why Nintendo went all out with these games, considering that these are just ports of NES games.

I've always suspected that Nintendo cares a lot less about PC emulation than it does about mostly-chip-compatible knock-off hardware.

Think of those consoles you see at Walmart that claim to come with "100 games built in!" Those frequently contain chip-compatible designs of Nintendo's old hardware, and a library of ROMhacks of NES games (or the ROM from a single one of those 100-in-1 NES carts.) They're not emulating Nintendo ROMs; they're just running them, directly.

The manufacturers of these consoles never bothered to clone newer consoles after the NES, because all the demand for these knock-offs seems to come from some weird combination of nostalgia and clueless-parent value-purchasing (e.g. "oh hey, it has Super Mario Bros on the box! That game was great. My kids would like that!")

But the GBA is just as easy to clone the internals of as the NES is (ARM7 cores are just as easy to find—and cheap—as 6502s), and has a far larger library of games (~17000!) So if these companies could transition to a GBA chip-compatible design and still ship these NES ports, they'd increase value immensely, while still being able to put NES nostalgia on the box.

And so, for these ROMs that might have been just the thing to spark this switch-over, Nintendo went to some extra effort.

It would have been funny if any of these knock-off manufacturers had already finalized a hardware design based on tests with other GBA ROMs and started up their logistics pipeline to assemble consoles, only to flash one of the NES Classics ROMs onto a new-off-the-line console and realize that it was far more stringent than other games about faithfulness to the GBA's architecture. It might be enough to kill a whole company.

---

Of course, none of this ever materialized, because for some reason, the knock-off manufacturers are still just making consoles that are chip-compatible with NES games, rather than speccing out builds based on what chips have become equally cheap since then. Who knows why.


> Of course, none of this ever materialized, because for some reason, the knock-off manufacturers are still just making consoles that are chip-compatible with NES games, rather than speccing out builds based on what chips have become equally cheap since then. Who knows why.

Actually, there are now quite a few Chinese GBA clones on the market, and my understanding is that they do not use emulation but are hardware-compatible reimplementations of the GBA hardware.

Some examples:

http://www.k1gba.com/

http://exeq.ru/produkcija/pristavki/detskie/gamebox.html

http://obscurehandhelds.com/2010/08/the-nintendo-game-boy-ad...


If I could find a good one, I just might buy it, too. My AGS-101 isn't exactly an ideal system. But then, a GBASP might be cheaper...


The 101 is the most desirable SP because of the nicer backlit screen, so don't just get rid of it.


Wait, did I say AGS-101? Sorry, mistake on my part. I meant my AGB-001


Part of me wonders if this is just some experienced game dev, working on a boring porting project, who decided to have a little fun and see how he could cause issues with emulation - not some large sanctioned measure to defeat emulation software. Might explain why this effort went into such a seemingly unlikely game.


If I were a game dev, I would want my work to be preserved past the life of the specific hardware it was written for. It's sad that so much culture is lost due to proprietary hardware and software.


True.

I recently "bought" Titanfall 2, on DVD. I put "bought" in quotes because I wonder how much I actually own it, if at all.

It wouldn't run at all without first connecting to the internet and downloading multiple gigabytes of updates.

I wonder what would happen in the future when EA decides to shut up shop or change their DRM platform. Will I ever be able to play it again?

This is why I often feel like pirates "own" the content more than legitimate buyers.


Yeah. Compare this to the 90s games I've been playing: I can still take a copy of Q3A or UT from '99 and install it on my computer (although why you wouldn't use the IOQ3/UTPG patches is beyond me). Ditto for Doom and Descent (although you need a source port). Thief, Deus Ex, System Shock 2, and Baldur's Gate are a little harder to get running (you have to use Wine, and in the case of Deus Ex, figure out how to get the shoddy programming to work with modern hardware), but it's still doable.

Will that be true of FTL, Bastion, Shadow of Mordor, Limbo, or Assassin's Creed?


I know that I've got Bastion and Limbo as DRM-free downloads through Humble Bundle. I don't have FTL, but they sell it (likewise DRM-free). Does FTL have some kind of online component, or something? Any reason that it couldn't keep working in the future?

I get the point, though. I've got dozens and dozens of games from the 90s and 2000s, and the earlier ones are much less likely to have a reliance on some external authentication server, or something.


Better yet, the Q3A engine is open-source and is still maintained in the form of ioquake3. Just look at how many architectures it's compiled for on Debian. It's going to stay with us forever, no emulation needed.

https://packages.debian.org/sid/ioquake3


I did mention that...


In general, id Software games (at least up to Doom 3) will be around the longest, since the engine's code is available under the GPL.


And Descent, because that's GPLed too.


Deus Ex works fine on my PC - though I have (I think) the New Vision mod as well.


It works, all right, but they bind game logic to framerate, and then don't lock the framerate, so you have to go in and set a framerate lock or the whole thing will be unplayable.
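
The standard fix, sketched (the timer and game hooks here are assumed, just to show the loop shape): decouple the simulation from the render rate with a fixed timestep, instead of advancing the world once per rendered frame.

    #include <stdint.h>

    extern uint64_t now_ms(void);        /* assumed platform timer  */
    extern void update(double dt);       /* assumed game-logic step */
    extern void render(void);

    void game_loop(void) {
        const double dt = 1.0 / 60.0;    /* simulate at a fixed 60 Hz */
        uint64_t prev = now_ms();
        double acc = 0.0;
        for (;;) {
            uint64_t t = now_ms();
            acc += (t - prev) / 1000.0;
            prev = t;
            while (acc >= dt) {          /* catch up in fixed steps */
                update(dt);
                acc -= dt;
            }
            render();                    /* render as fast as you like */
        }
    }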


That's why I've been buying GOG versions over the Steam versions of games where possible: they're DRM-free. The only real issue is that the selection isn't as vast and generally doesn't get a lot of the latest games (due to them having all sorts of draconian DRM, e.g. Doom 2016).


I play my games through steam, but often buy through Humble or GOG when I can. Mind, I can't always do that (and will buy through Steam when there's a sale on), but it's insurance when I can.

For really old games, I usually just buy the disks outright.


But your work here is a port. You don't have any creative input to make you feel attached to it, and it's already going to be preserved in its original (NES) form.


That would actually be an explanation! They had nothing else creative to do but to go all in with the directive they had from management: "oh, and put in some kind of copy protection". Copy protection, you said? Yes sir, will do...


The fact that it's a port makes me approach it from the other side.

Since the original code was designed to run on NES, I wonder if these were hacks to work around issues with NES code running on GBA rather than a deliberate consideration to stop people from porting your port.


I've been reading about the Game Boy's architecture and internals (in the hopes of making a game for it), and it's pretty amazing how much specialized hardware there was in earlier generations of consoles, especially compared to the more general modern consoles. And yes, new consoles have GPUs, but those are everywhere.

Your computer has a GPU. It probably doesn't and didn't have scrolling registers or hardware sprites (unless, like the C64, it had a gaming focus).
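
For a concrete taste of those scrolling registers on the Game Boy (register addresses per the Pan Docs): the PPU shifts the entire background when you poke two bytes, with no CPU blitting at all.

    #include <stdint.h>

    /* DMG background scroll registers */
    #define SCY (*(volatile uint8_t *)0xFF42)  /* scroll Y */
    #define SCX (*(volatile uint8_t *)0xFF43)  /* scroll X */

    static void scroll_background(uint8_t x, uint8_t y) {
        SCX = x;  /* the tile map never moves in memory; the PPU just */
        SCY = y;  /* starts fetching from a different offset          */
    }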


There is an excellent video from 33c3 about Gameboy internals and all the wonderful tricks that can be utilized when writing software for it. https://media.ccc.de/v/33c3-8029-the_ultimate_game_boy_talk


C3 never ceases to surprise me. Thanks!


One could at least describe the "text mode" of VGA/CGA/EGA video hardware as actually being the same kind of "tile mode" you find in consoles. And in slightly more modern video cards (the kind you'd find in a Windows 95/98 machine), the mouse cursor was usually a hardware sprite. (Even today, mouse cursors seem to frequently escape OS gamma correction for some reason...)


And even without a dedicated scrolling register, John Carmack famously managed to get the IBM PC to do scrolling by manipulating the screen buffer address (and emulating sprites by manually redrawing those parts): https://en.wikipedia.org/wiki/Adaptive_tile_refresh
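
Concretely, "manipulating the screen buffer address" means reprogramming the CRT controller's display start address, which picks which byte of video memory becomes the top-left of the screen. A sketch (standard EGA/VGA CRTC ports; outb() as in Linux's <sys/io.h>, purely for illustration):

    #include <stdint.h>
    #include <sys/io.h>   /* outb(); needs ioperm()/root on Linux */

    #define CRTC_INDEX 0x3D4
    #define CRTC_DATA  0x3D5

    static void set_display_start(uint16_t offset) {
        outb(0x0C, CRTC_INDEX);        /* start address high */
        outb(offset >> 8, CRTC_DATA);
        outb(0x0D, CRTC_INDEX);        /* start address low  */
        outb(offset & 0xFF, CRTC_DATA);
    }
    /* Changing 'offset' scrolls instantly -- no pixels move. */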


My first computer was an Acorn Electron, and that had hardware scrolling, too.

I don't think it had hardware sprites, but game programmers of that era often used compiled sprites to improve performance.


Does Legend of Zelda: Minish Cap have something similar? I played that game on a Raspberry Pi but it always freezes and dies when I enter a certain room in the final boss castle... Not sure if it's just a bad port or a security feature... I would love to finish it, but don't feel like replaying the rest of the game just for the ending...


Which emulator?


I want to say it was running RetroPie? Edit: I don't have it with me, but I think it was the mGBA emulator, sorry.


That's not an emulator, it's a frontend. You'd need to know the core, e.g. mGBA or VBA-Next.


RetroPie is a collection of emulators (and also distributed as a special-purpose Linux distro), mostly ported to the libretro API, and all accessed through the EmulationStation GUI.

For GBA emulation, RetroPie has the choices of "gpSP" (either through libretro or independently), "vba-next", and "mgba".

edit: Sorry, jpfau, saw yours after refreshing.


I wonder if these tricks to defeat emulators have the opposite effect. They give truly interesting challenges that can motivate talentuous developers.


Yep. See, for example, 4am's cracks of the Apple II software collection (https://archive.org/details/apple_ii_library_4am), done half for the sake of preserving old games, but also half for the sake of discovering (and documenting!) the many kinds of anti-piracy measures that went into them.


Yeah, ironically, they could lead to the emulator developers more accurately emulating the original hardware. Then other games may actually emulate better!

It might not be practical to actually make one, but theoretically there's nothing you could do to stop your game running on a 100% accurate emulator. And some good emulators are amazingly close.


> talentuous

I like the word, but I think you mean "talented". :)


Article is from 2014. Suggest adding info to submission title.


Indeed, I even posted it back then: https://news.ycombinator.com/item?id=8808754


I find it very interesting to read about these sorts of tactics, given that their design requires not only a high degree of expertise with console hardware (which Nintendo's developers obviously have), but also a high degree of familiarity with emulators themselves, in order to understand the common optimizations and shortcuts that emulators take for speed - and the techniques likely to defeat them.


If I remember correctly, Nintendo also implemented tough DRM like this for the GBA Video series. Same light gray cartridges, I think.

(GBA Video was a short-lived series of video content available on GBA carts. I remember having a cart with an episode of The Fairly OddParents on it. The video quality was terrible. Nothing worth protecting.)


Why bother? If these are ports of NES games, couldn't they just emulate the NES versions?


Welp, I committed the sin of posting before reading. That was the last paragraph of the article.



