Microsoft admits to signing rootkit malware in supply-chain fiasco (bleepingcomputer.com)
280 points by shp0ngle on June 26, 2021 | 130 comments



Microsoft needs to stop signing kernel level anti cheats completely. They are always rootkits by design.

No game should require a kernel driver.


It also just keeps happening. A recent game (Genshin Impact) shipped with an anti-cheat driver that exposed unauthenticated read/write primitives via IPC for debugging, and it has been abused ever since.

It's security theater.


Can Microsoft revoke their signatures on kernel drivers? If so, why haven't they revoked Genshin Impact's anti-cheat driver now that it has been proven to be a vulnerability? Is there a technical reason, or purely business?


It's a very difficult problem to solve. Once you sign something, it's by design impossible to repudiate that signature.

That in turn means you need some other way of "revoking" or "repudiating" that signature. In the case of TLS for example, it's a certificate revocation list.

I don't know if there's such a mechanism on Windows Kernel.


Whoa, that's a really scary example that would be great to point people to to show why this stuff is so bad. Can you link to more info about this?


Here's a PoC for it on GitHub, you can read the code, it's pretty much just an IOCTL for each operation. Check Driver/MhyProt2.cs.

https://github.com/kagurazakasanae/Mhyprot2DrvControl
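For anyone unfamiliar with how "an IOCTL for each operation" works: Windows drivers dispatch on 32-bit control codes packed by the DDK's CTL_CODE macro. A quick sketch of that packing (the device type, function number, and access bits below are illustrative, not the actual MhyProt2 values):

```python
# How Windows IOCTL control codes are packed (the CTL_CODE macro from
# the Windows DDK). The constants chosen here are hypothetical examples,
# not the real codes used by the MhyProt2 driver.

METHOD_BUFFERED = 0
FILE_ANY_ACCESS = 0
FILE_DEVICE_UNKNOWN = 0x22

def ctl_code(device_type: int, function: int, method: int, access: int) -> int:
    """Pack the four IOCTL fields into one 32-bit control code."""
    return (device_type << 16) | (access << 14) | (function << 2) | method

# A hypothetical "read arbitrary memory" operation on a vendor device type:
IOCTL_FAKE_READMEM = ctl_code(FILE_DEVICE_UNKNOWN, 0x801, METHOD_BUFFERED, FILE_ANY_ACCESS)
print(hex(IOCTL_FAKE_READMEM))  # 0x222004
```

A user-mode client then just opens the device and calls DeviceIoControl with one of these codes — which is exactly why an unauthenticated dispatch handler is so dangerous.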


Yikes! Thank you!


Microsoft would not want to stop since it keeps users locked in to Windows. The majority of games now support Linux according to ProtonDB, but the few popular games that don't are almost always due to a kernel level anti cheat.

https://www.protondb.com


I love Proton, and I'm sure you like it too, but it's not an existential threat to Windows in any way.

People always take the path of least resistance which is Windows on the laptop/desktop that they bought at Best Buy. They won't reinstall their OS because they don't know what it is.


It won’t affect the average user, but the only thing keeping some people on Windows is games that use anti-cheats like BattlEye or EasyAntiCheat.

With Proton, a lot of games now run quite nicely, aside from the ones that require anti-cheat systems.


Right, but now you're trying to solve a completely different, unrelated problem. The problem with this particular situation is that it hits the average user, not folks who already know what ufs is, or are mad enough to install linux just so they can game with proton.

(and no, already using linux as daily driver doesn't count, none of those folks were even remotely impacted by this windows-manufacturer-signed rootkit-on-windows problem.)


Windows has such a diverse set of users, I don't think any one thing could ever be an existential threat to Windows, not even Windows being unseated as the most popular OS for PC gamers. But nonetheless, that is clearly one battle Microsoft intends to win, even if they could afford to lose it.


They might buy a Mac, though. As far as evil software megacorps go, I kind of prefer people using Apple products to using MS products.


How do you feel about the monitoring of every executable you run, when and where?


Are you suggesting that MS does this or Apple?

Is this known behavior documented somewhere I can learn more about?



> The majority of games now support Linux according to ProtonDB

This is misleading IMO. The dashboard doesn't do a good job of letting me know what proportion of mainstream games (from Ubisoft, EA, etc) are supported. And what the performance delta is.


That, and I believe it's for specific setups, e.g. Ubuntu 16.04 with 3xxxx nvidia binary driver, and may not work with your specific configuration (newer Ubuntu, different distro entirely, AMD GPU instead of nvidia, etc). While useful, I definitely agree with you that it is a bit misleading


Talking out of school here, but games currently seem to require such low level hardware access that this is rendered impossible.

It doesn't matter if your game has crappy memory management and an escalation exploit roots your Xbox; an update will quash that. But on PCs, people are banking and buying/mining crypto and then launching Fortnite or whatever on the same machine.

Game security is nuts. It's always been a training ground for RE/binary exploitation engineers, so thats a plus.


> but games currently seem to require such low level hardware access...

lol, no. The industry has converged on a small number of engines, which is why you see so many cross platform games - back in the day that usually required subcontracting out the task of porting. Also, what is it that you imagine they need to poke so deeply in hardware that they need to bypass the kernel?


Gotcha, I suppose I was thinking about the hardware and other drivers. Games sometimes implement special audio, network, video stuff.

Point taken, do you think there is any reason they can't run with locked down privileges or is it all anti-cheat?


It is all "anti-cheat". Many years ago gamedevs like John Carmack were no joke wizards pushing the envelope, which certainly required low level access. Today's gamedevs aren't doing that... not even close. Modern games are known for shamefully janky code - and it isn't motivated by performance, see the GTA json parsing travesty for example: https://news.ycombinator.com/item?id=26296339


While there’s no question Carmack was an exceptional talent, let’s not forget the original Doom shipped with serious network killing bugs in 93:

https://doomwiki.org/wiki/Broadcast_packet_meltdown

Making software is hard, I’d not be so quick to judge the GTA devs, especially when GTA games are massively larger development undertakings. Doom was made by a handful of people in less than six months - GTA most certainly wasn’t. There is such a vast difference in complexity I’m not sure there is much value in comparing. The entire doom source code is likely smaller than a single grand theft auto save file.


> serious network killing bugs in 93

hmm, having a hard time determining if I'm simply defending an entrenched position or if the fact that they were the first to write FPS netcode might excuse the fact that it didn't initially run super great on corporate IPX networks that wired everything together with hubs instead of switches :) I vaguely remember hearing about that bug, and I very likely played the game prior to getting my hands on a version with it patched out. Never noticed a problem in the computer lab, but then we were still on token ring - where collision lights weren't really a thing.

> The entire doom source code is likely smaller than a single grand theft auto save file.

Is that a defense of GTA, you think that "complexity" is needed? I don't think so, I think it is emblematic of something very wrong in not just software - but in the mentality one would need in order to look at the depth of our average stack traces and think "Meh, 25 calls deep isn't bad for text file pretty printer".


I think it’s pretty ridiculous to think something more complex than doom isn’t needed in a game simulating a vast number of systems like GTA, yes. Bugs and issues increase with complexity, surprise!

Doom is gloriously simple by today’s standards - it doesn’t even support “room over room”. That isn’t a criticism of today being over-complex either - the state of the art in gameplay systems has just moved on massively as compute power has increased. We take “room over room” for-granted in 2021 which we couldn’t in the 90s.

Separately, if you can manage a software release with 1000 human developers, multiple release platforms and millions of users over 5 years (like GTA 5) and not ship random little performance bugs, you need to tell the rest of the world the secret of how you did it.


> I think it’s pretty ridiculous to think something more complex than doom isn’t needed...

That isn't what I said, it isn't even close enough to pretend that it isn't a bad faith interpretation designed to strawman an argument where you don't look totally ridiculous. Here, lemme help you get back on track - you said: "The entire doom source code is likely smaller than a single grand theft auto save file." You think I might have been addressing that?

> Maybe you are still ok with 640kb of memory too though? ;)

Oddly enough, I'm currently writing an (eventually) open source firmware for an ancient APC SmartUPS that used a variant of the 8051 MCU... I'm finding the 16K ROM and 256B RAM pretty roomy. I expect the other two people who eventually flash it onto their hardware in the next 30 years will be very pleased.

> random little performance bugs

lol, writing a microtransaction json parser that increases load times by an order of magnitude, for millions of customers, for years - and is so braindead that somebody with all the resources of a freeware decompiler and a stopwatch can do your job better than you... that isn't a "whoops, more seizure t-poses", that is a "I care so little about the quality of my work that I'm not even gonna bother dumping a flamegraph for that 10 minute freeze everyone has to sit through - for years."


> The entire doom source code is likely smaller than a single grand theft auto save file.

Just checked, "linuxdoom-1.10" (https://github.com/id-Software/DOOM) is 1.26 MB, my GTA save is a little over 500 kB. The GTA save files are actually quite small, likely because most things in the game world are emergent behavior of the game's systems and aren't permanent state (like in e.g. TES games, which are notorious for large save files, as well as save file corruption bugs due to "oops we didn't think saves would grow bigger than 16 MiB!").

That actually makes it more impressive, since the GTA game world is the result of all these systems working together to create a "living, breathing world" indeed, versus what you see in most other open world games (most recently: CP2077).


> While there’s no question Carmack was an exceptional talent, let’s not forget the original Doom shipped with serious network killing bugs in 93

You're confusing "this is necessary" with "there are no (horrible) problems with this".

Doom needed low-level access because it was (at the time) simply impossible to make the game run at a reasonable speed without low-level access. Separately, Doom had bugs, because all software is terrible.

GTA does not need low-level access, because everything it has a legitimate reason to do runs at reasonable speeds on modern hardware (or if it doesn't, low-level access won't speed it up noticeably). Separately, GTA has bugs, because all software is terrible.


This isn't the 80s or 90s anymore. Back then, pretty much every somewhat advanced game asserted full control over the computer hardware as the first order of business. Operating systems neither had the means to prevent this, nor did they have the right kinds of drivers in most cases. With proper protected mode operating systems that include usable driver interfaces, this is thankfully a thing of the past.

Modern hardware and modern operating systems usually take care to ensure that any hardware acceleration features are exposed in a way that doesn't create security risks. And, unlike on consoles with known hardware, it doesn't make sense anymore to try to talk to PC hardware directly from a game because things like GPUs are so varied and have so much "secret sauce" these days that trying to create custom drivers is an exercise in futility.


> but games currently seem to require such low level hardware access that this is rendered impossible

That doesn't seem plausible considering that a lot of games run under linux with wine/proton, so clearly they don't need direct hardware access.


I don't think it follows that because they can run on linux/wine compat they don't require excessive privs.

Maybe low level hardware access is not the right term, versus excessive privileges that the hardware acceleration libraries require?


Hardware drivers that deal with hardware acceleration are written by hardware companies. Libraries and applications only call their functions as defined in the documentation. Game programmers in most cases have no clue about the low-level specifics of hardware that accelerate their games.


The problem is the cheats use kernel drivers, so the anti-cheat providers have to keep up.

Edit: rather than downvoting can someone offer a reason as to why?


That is a cat-and-mouse game you cannot win, similar to DRM. There already are hardware cheating devices that use DMA to peek and poke main memory. Or cheats could run in a hypervisor or on vulnerable hardware in a SMM rootkit.

And if the response to that is ever more invasive surveillance technology then that's a security problem.


And the mistake people make is thinking it's a win or lose thing, it's not. It's about reducing the impact of cheaters and raising the bar high enough that an average Joe isn't willing to put up with the hassle needed to cheat. Just because you can't detect everyone doesn't mean you shouldn't ever try.


> It's about reducing the impact of cheaters

It's putting the collateral damage of these measures on the honest people to deter a few. These rootkits make everyone's systems less secure and less stable. Some even keep running after you close the game.

This is a hidden cost put on the consumers so that the game developers can profit without having to design a game that is safe when dealing with untrusted clients.


> it's putting the collateral damage of these measures on the honest people to deter a few.

It's putting the collateral on _everyone_ to deter a few who have a substantial impact. It's completely different to DRM where there is no knock on impact to other legit customers.

> Some even keep running after you close the game.

They all do. If they don't, then the cheat just needs to run first.

> game developers can profit without having to design a game that is safe when dealing with untrusted clients.

This isn't about profit, and thinking it is is pushing your agenda. On the most extreme side you can just pixel stream a rendered video feed, but the latency is awful for many kinds of games. You inevitably need to let the client have some sort of say (I shot at X), and it _will_ be abused.


> They all do. If they don't, then the cheat just needs to run first.

That is nonsense, otherwise the driver would have to be loaded at boot time, which in turn would mean installing a game would need a reboot which just isn't the case these days.

> It's completely different to DRM where there is no knock on impact to other legit customers.

The media industry claims that piracy cuts into their profit which would imply that it raises prices for everyone. So no, it very much is like DRM in many aspects.

> On the most extreme side you can just pixel stream a rendered video feed

Which is expensive since you now have to pay for the hardware instead of your customers paying for it, so that too is about profit. In a more extreme case you could put cloud gaming machines wherever CDNs put their edge acceleration boxes.

> You inevitably need to let the client have some sort of say (I shot at X), and it _will_ be abused.

No, not really. The client does not have to compute the canonical outcome of game mechanics, after all for any action they predict locally there might be an action taken by another player that is inconsistent with that prediction and they will only learn about that incompatibility once their lightcones intersect (which takes a few milliseconds). Something has to reconcile those, which might as well be the server rather than the client.

All a client has to do is to compute a tentative game state and the pixels that go along with it. That state will later be corrected once it learns about the canonical update, which can result in the infamous rubberbanding effect.

Most cheats that don't involve a broken game engine are of two flavors A) extracting information that the software has but the human shouldn't know (e.g. wallhacks) or B) having software perform inputs that the human should do (e.g. aimbots). A) Can be prevented by reducing the information the server sends to the client to the necessary amount to render their current view. B) cannot be prevented by any means as one could always hook up the output pixels of a GPU to an external computer and an emulated mouse via USB, this is analogous (heh) to the analog hole in DRM.

Neither of those have to do with the client determining "I shot at X" on its own.
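The client-predicts, server-reconciles model described above can be sketched in a few lines. This is a toy illustration (all class and method names are invented): clients send raw inputs, the server validates and applies them in arrival order, and only the server's snapshot is canonical.

```python
# Minimal sketch of a server-authoritative game loop: the client never
# reports its own position, only its inputs. The server clamps inputs
# to legal values and broadcasts canonical state; a client whose local
# prediction disagrees snaps back to it ("rubberbanding").

from dataclasses import dataclass

@dataclass
class Player:
    x: float = 0.0

class AuthoritativeServer:
    def __init__(self):
        self.players: dict = {}

    def handle_input(self, player_id: str, move_dx: float) -> None:
        # Validate the input itself; a speedhacked value is simply clamped.
        move_dx = max(-1.0, min(1.0, move_dx))
        self.players.setdefault(player_id, Player()).x += move_dx

    def snapshot(self) -> dict:
        # Canonical state sent to every client each tick.
        return {pid: p.x for pid, p in self.players.items()}

server = AuthoritativeServer()
server.handle_input("alice", 1.0)
server.handle_input("alice", 5.0)   # cheating speed, clamped to 1.0
print(server.snapshot())            # {'alice': 2.0}
```

A real engine adds interpolation and lag compensation on top, but the trust boundary is the same: inputs in, state out.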


> That is nonsense, otherwise the driver would have to be loaded at boot time, which in turn would mean installing a game would need a reboot which just isn't the case these days.

Riot Games' anti-cheat Vanguard (for LoL and Valorant) loads early during boot, requires a reboot after installing and if I recall correctly they now allow unloading it (because who wants to do banking with a rootkit loaded) but then require a reboot before playing again.


Ok, this is even more invasive than my worst experience and and also worse than some practices that already received backlash (such as persisting after the game closed, as previously mentioned). I wonder why game companies come to so different conclusions about the "need" for such things. Are their games easier to exploit and they just try to paper over it? Or is money involved, raising stakes?

If they go further we'll end up with closed, console-like systems (secure boot only with microsoft's key, only signed apps allowed). And of course DRM vendors would follow their footsteps.


> That is nonsense, otherwise the driver would have to be loaded at boot time,

This is already common - see valorant as an example.

> Which is expensive since you now have to pay for the hardware instead of your customers paying for it, so that too is about profit.

No, it's about latency. See the absolutely vehement reaction to streaming services for twitch shooters on gaming forums.

> Something has to reconcile those, which might as well be the server rather than the client.

The problem here is that the server is some distance away from all the players. Waiting for a round trip from two players with server frame times could be up to 250ms, so for a better experience, some element of that is usually left up to the client; you can spin that either way, either the shooter gets the advantage, or the defender gets the advantage.
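The 250ms figure is easy to sanity-check with a back-of-envelope calculation (all numbers below are illustrative assumptions, not measurements):

```python
# Worst-case delay before player B sees player A's action: A's input
# travels to the server, waits up to one server frame to be simulated,
# then travels to B. Numbers are hypothetical for a poor route and a
# slow tick rate.

one_way_a = 80                      # ms, player A -> server
one_way_b = 80                      # ms, server -> player B
tick_rate = 20                      # Hz, a slow server simulation
server_frame = 1000 / tick_rate     # ms, worst-case wait for the next tick

worst_case = one_way_a + server_frame + one_way_b
print(worst_case)                   # 210.0 ms, in the ballpark claimed above
```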

> Most cheats that don't involve a broken game engine are of two flavors

Most game engines are broken. The reality of the situation is that 1) this stuff is _really hard_, and 2) there are tradeoffs to be made at every single step of the process that affect how the game plays and feels. Moving processing to the server means latency, predicting and correcting server-side calculations means rubber banding. Neither of these things feel good in a twitch shooter.

> A) Can be prevented by reducing the information the server sends to the client to the necessary amount to render their current view.

Not just their current view, but everything the client needs to know about the next X ms before it expects its next server update (which might even be adjusted at a later point in time). And that state is just sitting there in memory, waiting to be (ab)used.

> B) cannot be prevented by any means as one could always hook up the output pixels of a GPU to an external computer and an emulated mouse via USB,

The number of people who are willing to do that is drastically smaller than the number of people willing to pay $130 a month for a rootkit that they willingly install [0]. As I said in my previous comment, this isn't about eliminating cheating 100% - it's a cat and mouse game. It's about raising the barrier to entry from credit card to specialised hardware.

[0] https://www.skycheats.com/store/category/6-overwatch/

> Neither of those have to do with the client determining "I shot at X" on its own.

Even if you send inputs, and allow for a small amount of correction (because both clients have different representations of the game state at the same time), all it takes is one client to send "I actually did shoot at that guy 50ms ago".


> No, it's about latency. See the absolutely vehement reaction to streaming services for twitch shooters on gaming forums.

By putting trusted hardware close enough to the player you reduce the latency problem to a cost problem. This is more or less what game consoles currently do (of course the issue is that users still pay for it rather than the game company). Also, twitch shooters existed before rootkits. Choices made by game companies such as matchmaking with untrusted strangers probably exacerbated the problem over the years.

> Waiting for a round trip from two players with server frame times could be up to 250ms,

Only if they're living on different continents or have very high latency internet providers. Many games provide region-based servers for that reason.

And it's only a roundtrip that has to happen anyway. Clients stream their updates as they take actions, server reconciles them as they arrive and broadcasts the updates to all clients. This is the minimum amount of time it takes to get the information from player A to player B anyway unless you establish p2p connections between the clients and those happen to be lower latency than contacting the server.

> you can spin that either way, either the shooter gets the advantage, or the defender gets the advantage.

That doesn't matter, the point is that leaving reconciliation of some game actions to the client means clients can lie about something and claim that it was preempted by another action.

> Most game engines are broken. The reality of the situation is that 1) this stuff is _really hard_, and 2) there are tradeoffs to be made at every single step of the process that affect how the game plays and feels.

A problem being difficult is not a good justification for externalizing the costs, especially considering that hardware may be shared with other people, and botnets that might use those weaknesses create further externalities.

> "I actually did shoot at that guy 50ms ago".

That would mean trusting the client's clock which quite obviously is something you shouldn't do. Arrival time should be the only thing that matters.

> The number of people who are willing to do that is drastically smaller than the number of people willing to pay $130 a month for a rootkit that they willingly install [0]. As I said in my previous comment, this isn't about eliminating cheating 100% - it's a cat and mouse game. It's about raising the barrier to entry from credit card to specialised hardware.

That's only the tradeoff from the game dev perspective. It's totally ignoring the security or privacy implications of running these rootkits and the ever-escalating system restrictions that they demand. This is the crux of the argument. If we were only talking about game devs making tradeoffs between different game experiences, risk for themselves, profit and so on there wouldn't be a problem. But they're making a tradeoff with other assets that are not theirs.


It is about profit. They could simply not ship a root kit, and then presumably fewer people would buy the game because it would be known to have cheaters in it.


This is absurdly reductionist; they could also not ship a game, and fewer people would buy it because it is known not to have shipped. Therefore not shipping the game is "about profit", when it might be because the game itself isn't finished.


I honestly don’t understand what you are trying to say here. Not shipping a game would certainly result in less profit. These entities are profit motivated, so they make decisions to maximize profit. They decided shipping a game with a root kit is the best option available to them.


Because human moderation after an AI serverside filter is much more consistent and less desperate than solving the halting problem clientside 2 rings below your app's privilege level.


I do think you're correct, but one particularly sticky issue is what you do with all the subtle cheats. Rather than developing an exploit with wall-clip, radar, auto-headshots and so on - all obvious to a human moderator - what if you make one that just lowers recoil by a smidgen, 20% or so.

If you're watching the game as a moderator you could never tell that's what's happening. Yet -20% recoil, especially for a competitive esports player, would be such a massive advantage as to make that player the best in the world, but in a way that looks quite legitimate.


Wouldn't that be trivially detectable in the data by comparing the nominal recoil configuration, reported hitscan trajectories, and mouse input profiles?

Especially since the cheats need to be available somewhere, so the developers should certainly be able to get their hands on them to test...
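A naive version of that comparison is easy to sketch (thresholds and data shapes below are invented for illustration, and as the replies note, a cheat that randomizes its inputs defeats this kind of check):

```python
# Server-side heuristic: compare each player's mouse counter-movement
# against the nominal recoil pattern the game applied, and flag players
# whose residual error is implausibly small. A perfect compensator
# scores ~0; humans are sloppier.

from statistics import mean

def compensation_error(recoil_offsets, mouse_deltas):
    """Mean absolute difference between recoil applied and the player's
    counter-movement per shot."""
    return mean(abs(r + m) for r, m in zip(recoil_offsets, mouse_deltas))

def looks_assisted(recoil_offsets, mouse_deltas, threshold=0.05):
    return compensation_error(recoil_offsets, mouse_deltas) < threshold

nominal = [0.8, 1.1, 1.4, 1.2]        # upward recoil per shot (degrees)
human   = [-0.5, -1.3, -1.1, -1.5]    # sloppy human compensation
bot     = [-0.8, -1.1, -1.4, -1.2]    # pixel-perfect counter-pull

print(looks_assisted(nominal, human))  # False
print(looks_assisted(nominal, bot))    # True
```

A real system would aggregate over thousands of shots and compare against population distributions rather than a fixed threshold, but the signal is the same.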


I assume that the argument is that pros can compensate recoil to some extent (i.e. the game doesn't just add increasingly random offsets to the trajectories but actually moves the crosshair, and you can move it back if you're fast enough). If so then a cheat only has to simulate sufficiently realistic inputs based on the pixel outputs of the game, which is practically undetectable if you do it on external hardware. The only advantage the game devs would have is a larger amount of data they can use to determine what passes as realistic. And how are they going to identify false positives?


Not when the inputs are all randomized, it's nearly identical to a real human recoil control


Can you give an example of some games that take that approach that it works for?

I'd also suggest that human moderation is definitely less consistent, and definitely less able to scale.

Nobody is trying to solve the halting problem in the kernel either. If a cheat runs as a kernel level driver, and the anticheat doesn't, you've lost the battle already.


CSGO and Overwatch come to mind. I watched a talk on the latter, about how the devs transitioned and optimised their anticheat system.

It makes sense that the cheat I wrote would run as admin. It's my PC, after all. I also don't have access to the server in any meaningful way, nor can I modify other people's data. It's a good trade-off. Devs can also write arbitrarily good statistical analyses for the game. It's their responsibility.

The question is simple - is this person performing abnormally? Well, games gather tons of telemetry nowadays, and the harder-to-detect cheats like wallhacks or fluorescent player textures are also the least useful. I am a hard silver I in CSGO. With or without aids, when I see someone run around the corner, I'll duck and magdump. Any gold nova will overpower me either way.


So, when cheats are implemented as rootkits/VM hosts, will that be required for the anticheat too?

I think at that point, it's turtles all the way down.


They already are rootkits and running in VMs. That's exactly why we're talking about this. Most anticheat programs attempt to detect running in a VM host too, and stop that.


It has always seemed like a flawed argument to me. The effort put into detecting running cheats requires the ability to monitor processes and a level of access that has no place in a game.

How many machines is Valorant's anti-cheat installed on where it shouldn't be, silently giving someone the ability to decrypt and monitor TLS traffic on the device without detection?

It would make a lot more sense to detect cheating based on user input. Even stochastic cheats (where you, say, improve your aim in more subtle ways) are possible to detect, yet the most egregious auto-lock-ons go on for years.

Kernel level anti-cheats are attractive, because you suddenly have a rootkit installed on millions of computers that you control. Monitoring the behaviour of them is also no good, because a targeted update can make it do whatever.


Literally no part of this comment is true. They’re not hidden and they’re not malicious.

They’re also very much required, because a user mode anti-cheat is trivially circumvented by a kernel mode cheat. The only way to defeat cheats is to wage continuous war with them, adapting to their tactics and stamping them out. It’s like collecting trash; trash will keep accumulating, your goal is to keep homes and streets clean in spite of that.


Every part of that comment was true. They are rootkits with a non-hidden persona, they are not malicious unless it's worth being malicious.

I don't give trash collectors access to my bank account so that they can better estimate the amount of plastic I'm about to produce and better prepare themselves to make streets even cleaner


That’s such a weird point when people complain about bots and cheaters in games a lot. It almost killed Fall Guys because games stopped being fun.


> They’re also very much required

That's a weird notion of "required". Very few things are "required", notably food and oxygen. Saying "no" to bullshit like rootkits is very easy in comparison.


That's a pretty strong take. One could easily have read it as "required for the game" as in, the multiplayer experience drives the replay value of the game, and prolific cheating destroys the community.

Of course, it still makes me avoid such games, but saying the argument is invalid because required only means air and water is absurd.


As a gamedev that develops online games I have to disagree that they are required.

Whatever type of anti-cheat you want, you can implement it server-side with today's hardware capabilities. However, that would increase server requirements and bandwidth considerably, and since companies don't want to pay for that, they install rootkits on people's computers.


I’m going to be Bill Gates for a second and ask for your definition of “can”.

If you mean they can implement a fully server-side anti-cheat without compromising gameplay experience and without requiring the player to have a super-duper fiber connection with 5ms ping to the game server - no. They certainly can’t do it, not for fast-paced online shooters at least, not in this day and age.


While I agree with you to a large degree, from my limited perspective there is one problem with server-side cheat detection: it is inherently reactive (e.g. it will take a few cheating actions before a cheat is detected by the server-side anti-cheat), whereas on-client anti-cheat can be proactive (e.g. cheats can potentially be detected before the game is even initialized).


And if cheating can’t be detected via statistical inference on the server inputs like texture hacks to see through walls?


You can actually raycast at server time and only send the location of players visible to client back. So wallhacks are definitely detectable on server side, in fact one of the games I’m making uses this system, it isn’t even that expensive.
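The idea is simple enough to sketch. This toy version (2D, with walls as line segments; all names invented) raycasts from the viewer to each enemy and omits anyone whose line of sight is blocked, so a wallhacking client never receives their position in the first place:

```python
# Server-side visibility culling sketch: before each snapshot, raycast
# from the viewer to every enemy; enemies occluded by a wall segment
# are simply not included in the data sent to that client. The 2D
# segment-intersection test is a toy stand-in for a real engine's
# occlusion query.

def _ccw(a, b, c):
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 crosses segment p3-p4 (general position)."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)
            and _ccw(p1, p2, p3) != _ccw(p1, p2, p4))

def visible_enemies(viewer, enemies, walls):
    """Only enemies with an unobstructed ray to the viewer get sent."""
    return [e for e in enemies
            if not any(segments_intersect(viewer, e, w0, w1) for w0, w1 in walls)]

walls = [((5, -10), (5, 10))]   # one vertical wall at x=5
print(visible_enemies((0, 0), [(10, 0), (3, 2)], walls))  # [(3, 2)]
```

The cost is one ray per viewer/enemy pair per tick, which is cheap at typical player counts.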


Scenario: I step around a corner. There are enemies behind the corner which I didn’t see previously because they were hidden behind the corner for me.

Will I see enemies behind the corner instantly, or do I have to wait for round-trip time for the server to acknowledge that I stepped out of the corner and tell me where the enemies are?


Wall scale would have to be reduced by around 10% for the server-side visibility check, so that player view info can be sent early enough to avoid RTT problems, especially in cases like the one you mention.


This makes sense. However, what if we consider another scenario: a cheating player just staying behind the corner. He/she would have no problems receiving information about enemies behind that corner, making it much easier for the cheater to push the corner. Basically, the portion of the wall that you cut off for purposes of this visibility check can be considered transparent for cheaters, and opaque for legitimate players, putting legit players at a disadvantage.

Can this happen in the game you make? If yes, do you consider this a vulnerability?


It definitely can happen; it is a vulnerability, but an acceptable one:

1. There is no such thing as a perfect anti-cheat solution with today's resources. Even client-side rootkit-based solutions can be overcome.

2. My solution completely eliminates wall hacking in 90% of cases; the same can't be said for rootkit anti-cheat solutions.

3. The game can be designed to put corner campers at a disadvantage. But honestly, there's no need to think extremely deeply about such cases: you would first release the game, then check player complaints about wall hackers. If it's a major issue, you can implement a double ray-cast solution for the most-reported players, use correctly sized walls for their ray-casts, and ban them statistically. Even if you accidentally ban some players, it wouldn't be a huge portion of your player base.
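The "ban them statistically" step could look something like the following sketch. This is an assumption-laden illustration, not the commenter's actual system; the 150 ms floor and 80% threshold are invented numbers:

```python
# Illustrative server-side statistical check: flag players whose reaction
# times between "enemy became visible" and "player fired" are consistently
# sub-human. Both thresholds below are assumptions for the example, not
# established constants.

HUMAN_FLOOR_MS = 150   # assumed lower bound on genuine human reaction time
SUSPICIOUS_RATE = 0.8  # fraction of sub-floor reactions that triggers review

def is_suspicious(reaction_times_ms, min_samples=20):
    # Too few samples means not enough evidence either way.
    if len(reaction_times_ms) < min_samples:
        return False
    fast = sum(1 for t in reaction_times_ms if t < HUMAN_FLOOR_MS)
    return fast / len(reaction_times_ms) >= SUSPICIOUS_RATE
```

Requiring many samples before flagging is what keeps the accidental-ban rate low, at the cost of reacting slowly.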


Yes, but nonetheless we either get a solution that affects the gameplay experience, like enemies popping in with a delay, or one that works only partially - in the reduced-wall variant, a cheater will always have a bit more time to prepare for what's around the corner.

It is true that there is no silver bullet. Anti-cheating solutions are all about compromises. Ring-0 anti-cheats are just another compromise that enable anti-cheat developers to scan for cheats efficiently, with some risk of privacy abuse added (and not as big risk as some may think - I believe Microsoft would revoke driver signatures from any anti-cheat developer that tried doing shady stuff, after which the developer would go bankrupt, most likely). This is not a compromise that most of HN would take, but I don't see a problem if users are well-informed about the risks.

False-positive bans of legitimate players can become a legal liability, especially for users that paid real money for the game.


This doesn't 100% fix the vulnerability. If your game has footstep sound effects the hack can detect the location of the audio sources playing footstep clips.

(I agree though that low-level anticheat also doesn't 100% solve cheating, and causes lots of problems for legitimate players. Just that cheating is always going to be a game of cat and mouse)


> the hack can detect the location of the audio sources playing footstep clips.

You don't need a hack for this; the whole point[0] of having footstep sound effects is that the player can detect the location of the audio sources playing footstep clips.

Edit: 0: well, and ambiance, but if your ambiance is impacting your tactical considerations, that's a game design issue in and of itself.


You also add a report system and do additional inference based on reports, add a system where random player review matches. When your multiplayer games become very popular and making you money, you have to put the effort in, not install rootkits on paying customers computers.


Or just accept the fact that you'll never completely stop people cheating? Have your server prevent obvious cheaters but maybe people should just accept that the only way to avoid people cheating against them is to only play with people they trust


That completely destroys a whole class of games. How do you suggest I get 60 of my friends together at once for a game of Apex Legends?

You'll never completely stop people cheating, but that doesn't mean that game developers should just give up. Thankfully, the people clamouring for less, rather than more, anti-cheat are relegated to a vocal minority.


When I was younger, we used to play on community servers with active moderation. That worked pretty well.


Active moderation is still a thing now, people are just actively employed by the game companies to do that job. The cheating situation in multiplayer FPSs is generally much better now than it was back in the day I was a mod for CS:S servers.


Cool, young people today prefer more competitive environments.


When one of your 60 friends refuses to stop cheating, maybe stop inviting him to the party. Lots of games and lots of circles of gaming friends get on just fine without kernel-mode anti-cheat.


You missed the point. The idea of coordinating games between 60 trusted people is ridiculous. Anti-cheat and matchmaking between untrusted players is essential in modern (FPS) gaming.


That wipes out any trust in public rankings and some very big games are entirely built around those. Not going to happen while competition is a major income source.


Then why do we still see cheats in these games? In this day and age, surely they could just use ML and human supervised cheat detection teams, and set up the incentives to make account bans more painful.

Instead, we get security theater and everything must be F2P to get the maximum number of people in the door.


Valve is working on using ML for anti-cheat but it's still quite a way from being ready.

https://win.gg/news/8115/is-ai-the-future-of-csgo's-anti-che...


It's "ready" in the sense that it runs, analyzes every single shot in every single game played on valve matchmaking servers, and submits the suspicious ones it finds to the player-controlled overwatch system. At John McDonald's GDC talk he showed them achieving a 98%+ success rate, which while not enough to start banning people outright since any false positives are too many, is super promising.
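A quick back-of-envelope calculation shows why even 98%+ precision isn't enough for automated bans (the daily volume here is an invented number, purely for illustration):

```python
# If "precision" is the fraction of flagged players who really are cheating,
# the remainder are falsely accused - and at scale that adds up, which is
# why verdicts go to the human Overwatch system instead of auto-bans.

def expected_false_bans(flagged_per_day, precision):
    return flagged_per_day * (1 - precision)

# e.g. 10,000 flags/day at 98% precision still wrongly flags ~200 players daily
```

Human review as the final stage lets the false-positive cost of the model stay off the ban ledger.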


If I had to take a bet I would bet it will never work.


How about:

1) if you want a challenging game, sign into a normal game server

2) if you want to cheat, sign into a cheat server

3) if you cheat on a normal server, you are banned for life, no questions asked, from ever using a non-cheat game server


> banned for life no questions asked

If a 13-year-old cheats at a game, do you really think they should still be banned when they're 30? Permanent punishments for things people did as minors are essentially unheard of for anything less serious than murder or rape.


HVH servers exist for CSGO, which are hack vs hack. It's a great alternative. The problem is Steam accounts can be generated en masse and CSGO is free.

Or accounts with the Prime upgrade that makes cheaters less common are often stolen ("cracked primes") and sold to cheaters for a few bucks.


Why can't a user-mode anti-cheat just check whether any unsigned kernel-mode drivers are active?

In any case, I'd argue the integrity of the OS is far more important than a game's anti-cheat. Game devs could ship kernel-mode anti-cheat, but users should at least be more aware that they are giving these programs full control.
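A user-mode check like this is roughly what `driverquery /SI /FO CSV` already exposes on Windows. Here is a hedged sketch of just the parsing step, so it runs anywhere; the exact column headers are an assumption, so check your system's actual output:

```python
# Sketch of the user-mode check suggested above: list kernel drivers and
# flag unsigned ones. On Windows, `driverquery /SI /FO CSV` emits a CSV
# with a signed-status column; only the parsing is shown here. The
# "DeviceName"/"IsSigned" header names are assumptions for this example.

import csv
import io

def unsigned_drivers(driverquery_csv):
    # driverquery_csv: CSV text as produced by `driverquery /SI /FO CSV`.
    reader = csv.DictReader(io.StringIO(driverquery_csv))
    return [row["DeviceName"] for row in reader
            if row.get("IsSigned", "").strip().upper() != "TRUE"]
```

Note that this only catches unsigned drivers; the rootkit in the article was Microsoft-signed, so a signature check alone wouldn't have flagged it.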


Sure. Give me the denuvo anti-tamper and anti-cheat source code, and then we can negotiate.


> They’re also very much required,

The games I play don't have these sort of anti-cheat features; the anti-cheat is handled entirely on the server side. A server admin who is attentive and ready to rollback griefs helps a lot too. But kernel-mode anticheat on the players' computers is not "very much required."

(The game I play most is minecraft, and I think it's more popular than any of the games with this sort of invasive anticheat.)


Microsoft has refrained from attributing this incident to nation-state actors just yet.

They can always blame "accidental human error."

https://www.bbc.com/news/world-asia-57367100


If we think rationally, why would a nation state put in effort to block ONE single search term to hinder information spread while still letting multiple other queries give the same results? There is zero to gain from that.


Will we admit yet that having OS vendors sign software does nothing to stop malware?


As with all security mitigations it does something, it’s significantly more difficult to get kernel level malware installed now than in the Win95 days, but it’s not a cast iron guarantee nothing bad will ever happen again.


Signing is a useful measure. But not by itself. There are several harder admissions to be made.

The security business is very lucrative ambulance-chasing. A business-grade OS needs high-confidence evaluation and design.

Windows cannot be safe while being all things to all users, with backward compatibility extending three decades. It may be time to split the product into more than just artificial marketing tiers.

Rewrite the OS in a safer language. I won't pick one, but Microsoft is large and sufficiently profitable to know what to do and how to do it. Windows 11 should not just be a change of curtains and doilies.


I think it doesn't stop malware completely, that's impossible, but if you look at the Apple ecosystem you can see that it does help, a lot.


It barely helps at all (almost all your apps pull in telemetry/auth libraries from data brokers regardless of the permissions you give them) and the cost (no more personal computing) is incredible.


It's a significant hurdle, especially if getting something signed requires some kind of certification process and company identity verification.

It also ensures that the OS vendor has a copy of the binary (although it will only be the first stage, I assume). Without signing, attackers can push malware onto one machine without anyone else getting a copy.


The driver is signed with the certificate of the developer, an EV certificate at that. The “Microsoft signature” is just an attestation signature and does not indicate the software is from Microsoft.

Every driver that runs on Windows 7+ is required to be signed this way via the Microsoft portal.

It's strange that the article seems to go out of its way not to mention the name of the certificate that first signed the driver.

They also fail to mention that this cross signature gives Microsoft the power to revoke the validity of the driver or every driver signed with the developers certificate.

Again this may not have fit into the crafted narrative they are providing.


The issue it seems is that MS dropped the requirement that drivers must also be signed by the developer’s own EV certificate - it seems developers can get binaries signed without that, which means only MS knows which company is the one that signed a specific program/driver in this configuration. https://twitter.com/gossithedog/status/1405976566694395906?s...


Wow that’s a huge WTF on Microsoft’s part. Why drop the only component of signing that actually identifies the author?!


The details in the article are much worse than the headline suggests.


Where does the driver come from, in the supply chain? I mean, is it the game devs that put it in? Is it the anti-cheat software they buy from a vendor? If so, who develops that and how did the driver get into it? It would be good to know at what stage it was injected.


What's the reasoning behind not disclosing the list of games affected?


Sounds similar to the AMD "PCI Driver": https://news.ycombinator.com/item?id=27171520

Well, perhaps for less shady purposes, but still very misleading.


So what are the legal ramifications? Because that's really the only interesting thing here? (A list of affected products would also be good, but that's not interesting, that'd kind of the bare minimum)


Edit: after getting in touch with Joshua, behind Echo, the original evidence on Twitter by Kevin Beaumont has been retracted [0] and Joshua claims that Echo is not related to the rootkit in question.

Edit 2: Some further clarifications about Echo from Joshua: "We also don't just give untethered access, it's a scan which does specific things and doesn't have any malicious capabilities from the staff member on the player. How it works is cheats leave traces behind, Echo is almost a forensic analysis to see if cheats were on that PC. For example, any executable on windows that resolves a domain has strings in the Dnscache service, or any process that makes requests to cheat servers will leave the domain as strings in lsass.exe (which can be PPL and can require kernel access to read from). We use the driver and strong obfuscation to prevent cheat developers (known to be good reverse engineers) from being able to just clear these traces."

[0] https://twitter.com/GossiTheDog/status/1408900596145569795

Piecing together info scattered across disclosures [1] and tweets [2], it seems the malware is related to Echo [3], a product marketed for Minecraft server owners that allegedly helps in detecting cheaters. From what I understand, suspected cheaters are requested by the server staff to download and install the Echo client, which shares the player's screen [4] with the admins and apparently gives full, unrestricted OS access.

I've tried to get in touch with the people allegedly behind Echo [5] [6] for clarification, and I'll update this once I get more info.

[1] https://www.gdatasoftware.com/blog/microsoft-signed-a-malici...

[2] https://twitter.com/GossiTheDog/status/1407328596247646217

[3] https://echo.ac

[4] https://www.reddit.com/r/screensharing/

[5] https://find-and-update.company-information.service.gov.uk/c...

[6] https://discord.gg/mGTTAT5


>From what I understand, suspected cheaters are requested by the server staff to download and install the Echo client, which shares the player's screen [4] with the admins and apparently gives full, unrestricted OS access.

Downloading and running software like that from someone you met on a game server is incredibly stupid. I know multiple people who have gotten rootkits and lost control of important accounts this way.


Hi swily, I own Echo. This is a common misconception about our software; we're an established and still-growing company. There are indeed some bad eggs in the community, however we are trusted and our tool has absolutely no malicious capabilities from the server staff on the player. Every mainstream game has a powerful anti-cheat performing kernel functionality just like ours. If someone is asked to download and scan with Echo, firstly, we're established, which is better than being asked to download "Swiley Tool"; secondly, they always have the option to say no and it's absolutely not forced.


I'm touched that you cared enough to reply to my comment and it is nice that it's separate from the game, that's definitely an improvement over the status quo for AAA closed source video games. I don't mean anything against you personally and am sure you and your team are wonderful and intelligent people but:

1) popularity alone does not justify misbehavior (especially distributing what are essentially rootkits.)

2) mature communities can function without cheating. I often play a game called Xonotic which has no such functionality and have yet to meet someone cheating at it.

3) I would always elect 100% of the time to abandon any game or server before running any kind of software like this.


I completely understand your concerns, however I have a few points that might change your thoughts on this:

- Almost all competitive games nowadays have a powerful client side anticheat which is constantly monitoring processes on a kernel level to combat kernel cheats. They use a lot of the same functionalities that Echo does, such as protecting memory. What makes our trusted company different from another trusted company such as Epic Games and their "easy anti-cheat"? The only difference is we're doing these checks using a scanner interface, with absolutely no extra functionality which could be used maliciously by the staff member.

When you install Valorant, using this logic, they are "essentially installing rootkits". But they're not rootkits, since it's authorized and an automatic process.

- Once the 30 second scan has completed, there is absolutely nothing left on the computer whatsoever. Echo is a single binary; on launch it extracts the .sys kernel driver and creates a service. It then uses that kernel driver throughout the scan. On exit, it deletes the service, deletes the .sys, and all that is left is the original binary you downloaded. It leaves absolutely nothing behind, and it does this automatically. Client-side anti-cheats from large game companies often require being launched on start-up (it may say "you need to restart your computer before playing"); this has even more malicious potential.

- Everything has malicious potential as soon as you click "Yes" on the UAC menu, trust in the companies is a huge huge part of keeping yourself safe. What if Facebook shares all your personal data randomly? They won't. What if Valorant starts doing malicious stuff with their kernel capabilities? They won't, and neither will we.

A computer forensic analysis of memory to look for cheats after they've been deleted is not as effective as a client-side anti-cheat would be; however, for Minecraft it's much easier, because it's difficult to make cheats for it without leaving traces behind. But it's the same level of risk being asked to use a client-side anti-cheat to play on a server as being asked to scan with Echo when the server-side anti-cheats show indications.


>Almost all competitive games nowadays have a powerful client side anticheat which is constantly monitoring processes on a kernel level to combat kernel cheats.

That's totally ok if the anticheat admins are providing a computer to run the software on and not attacking one the player owns. Again "almost everyone else does it" doesn't make it ok, I don't know why people keep repeating that.

>absolutely nothing left on the computer

That's nice that it doesn't leave a mess, copying tons of data out of the machine from ring0 probably exposes people to a lot of liability though, that whole thing sounds like a terrible idea but it's still better than the status quo so props to you for clearing that bar.

>- Everything has malicious potential as soon as you click "Yes"

Yes. That's my point.

>What if Facebook shares all your personal data randomly? They won't.

Well, it's not random, usually you have to pay for it except on some pretty bad days. That's a great example of why I would universally reject something like this though. You pretty much can never trust someone handing you closed blobs.

I'm honestly pretty shocked anyone tolerates this; it sounds pretty similar to random cavity searches when you leave Walmart. Maybe because I'm not a "kid" anymore I wouldn't know what the social scene looks like.


> That's totally ok if the anticheat admins are providing a computer to run the software on and not attacking one the player owns. Again "almost everyone else does it" doesn't make it ok, I don't know why people keep repeating that.

What do you mean by this? You realize almost every competitive ranked game right now has a kernel driver constantly running and monitoring in the background while you play right? Anti-cheats serve a purpose to detect cheats, just like an anti-virus serves a purpose to detect viruses.

You seem oblivious to the fact that there is no extra security risk from a client-side scan (like with Echo) versus having a client-side anti-cheat running while you play (like with every game). What point are you trying to make?

You keep saying it's a terrible idea but haven't said what's wrong with it...

It doesn't copy tons of data out of the machine through ring 0, and it doesn't expose anyone to anything whatsoever; it's almost all on-machine, and the same goes for most anti-cheats.


>Yes. That's my point.

I agree with your point, even chrome has the power to destroy your PC if they really wanted to (??).


I suppose that in the MC community, there is a certain level of trust in "SS Tools", since there aren't "that many" SS tools. I also think the culture is different in competitive Minecraft; whenever I get SSed I usually see a pattern: "They use AnyDesk", "They use either Echo, Paladin, Avenge or Actova" and "They usually do some manual stuff in Process Hacker", and this usually makes me feel more comfortable. It's ok to not want to get SS'ed in the community, but this usually comes with a punishment, like a permanent ban, or a temporary one. The SS'ing process in the MC community is more of a "Last chance to prove you are legit before your cheating ban" rather than a "I SS you just randomly".

In Echo's case they have taken action by now taking logs and looking at the custom themes where people used images such as "launching booter" and "Injecting rootkit" and are now handing out punishments to users who do that. ( Reference to Joshyer's announcement: https://media.discordapp.net/attachments/852881172152844318/... )

I would also understand why people would be confused about what a program such as Echo does. It's not a remote access tool, and it does not provide any access for anyone after the fact. Here is how it works if you are confused: I tell the person to send me their AnyDesk code, then I go to the Echo panel and generate a pin; on the person's machine I run the tool and enter the pin; the tool does some magic, removes itself from the PC, and then I can see all the info I need, like program start times, run and deleted files, internal strings, etc., without ever leaving anything on the user's PC, except possibly a log file. After the fact I have no access to the user's PC or data not necessary for scanning, and the program has been deleted.

Addressing your points:

1) Of course it doesn't, but again, it's not Echo's fault that some of the custom themes were posted, and some of their competitors don't exactly have squeaky-clean records either. But the fact that Echo removes itself makes it either seem legit, or like the world's worst virus (Canadian virus: "Whoops, didn't mean to infect your PC, I'll just remove myself, carry on.").

2) Welcome to Minecraft, a game where you download a zip file, run a .bat, and boom, you have the source code. Minecraft offers no internal anti-cheat, so solutions like Badlion's BAC or Lunar's (no longer existing) Lunar Online mode can help, since they catch cheaters by running in the background while the game runs, looking at DLLs and inputs to determine if a person is using an advanced autoclicker or a bad cheat like VL. Minecraft has plenty of server-side anti-cheats, since creating plugins for servers is even easier than creating a modded client. The issue with the game is that it runs at 20 TPS, meaning you only have 20 samples per second to deal with; this can lead to major issues, and some things are just near impossible to detect without a client-side component. And I can tell you, on Minecraft servers (especially Practice, KitPvP, Kitmap, HCF and SMP) cheaters make up anywhere from 5-50% of the playerbase, using anything from mild 8 CPS autoclickers to TPAura, Fly, Killaura, etc.

3) Well again, the culture is different. Some people decide to click the "Quit to menu" button, indicating that they would rather take the ban instead of proving they are legit, or avoid the embarrassment of being proven a cheater and the slight public humiliation of the "PLAYERNAME has been removed from the server for 'Cheats found in SS'" message getting sent to everyone on the network. Then there are the people who either think their cheats bypass, or who are legit and aren't afraid to show whoever runs the SS tools. So again, the culture around SS tools in Minecraft is different than in, let's say, Xonotic.
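The 20 TPS constraint from point 2 shapes what server-side detection can do: click timestamps arrive quantized to 50 ms ticks. A toy autoclicker heuristic under that constraint might look like this (all thresholds are invented for illustration, not from any real plugin):

```python
# Illustrative server-side autoclicker heuristic under Minecraft's 20 TPS
# constraint: the server sees click times only at tick (50 ms) resolution,
# so we combine rate with machine-like regularity over a window.
# max_cps and min_jitter_ticks are invented thresholds for this example.

from statistics import pstdev

TPS = 20  # server ticks per second

def looks_like_autoclicker(click_ticks, max_cps=12.0, min_jitter_ticks=0.5):
    # click_ticks: ascending tick numbers at which clicks arrived.
    if len(click_ticks) < 10:
        return False  # not enough evidence
    gaps = [b - a for a, b in zip(click_ticks, click_ticks[1:])]
    duration_s = (click_ticks[-1] - click_ticks[0]) / TPS
    cps = (len(click_ticks) - 1) / duration_s if duration_s > 0 else float("inf")
    # Flag either an implausible rate or machine-like regularity at speed.
    return cps > max_cps or (cps > 6 and pstdev(gaps) < min_jitter_ticks)
```

With only 50 ms resolution, gap jitter is coarse, which is part of why purely server-side detection struggles against well-randomized clickers.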

Hope this helps clear up a bunch of stuff. (Wrote more than for my final exam right here lmao)


> minecraft, a game where you download a zip file, run a .bat and boom you have the source code.

No. You have a high level transliteration of the bytecode. Xonotic is in fact distributed in source form so it's easier to cheat with.


Never heard of Xonotic before, but from a quick look at the website, it says: "Features such as... ...a functioning anticheat system". While I cannot vouch for whether it works (sounds like it, since you have never met a cheater), it's better than Minecraft, which has no anti-cheat and only runs at 20 TPS, which is why the SS tool market even exists in Minecraft. In games with a cheater problem, or where a bunch of things are handled client-side (like Minecraft), there are going to be external anti-cheat solutions. CSGO is another example: since VAC bypasses are widespread, solutions like FACEIT and ESEA have been made; these are all external solutions to prevent cheaters. Minecraft is also Java, meaning it's very easy to decompile yourself, and it only has minor obfuscation as well, so yeah, it's not source, but it's as good as source. (http://www.modcoderpack.com/)


As far as I can tell they're just doing statistical analysis on data that's already available to the server.


Wait a second. According to the article, the only weird thing the driver does is sending some CPU info telemetry and automatically checking for updates. By that standard, nearly every piece of gaming software that releases nowadays is malware. "Ningbo Zhuo Zhi Innovation Network Technology" is also not on the current DOD list of Communist Chinese military companies. The whois record for the given ip doesn't even link to this company, it's a protected record under some Chinese telecorp. Who wrote this garbage? This seems like a cheap attempt to generate outrage.


This malware registers a system-wide proxy and redirects certain traffic at the IP level. It also registers its own CA certificate, making TLS MITM possible. Finally, it has auto-update functionality, which could be used to push different malware to selected users.


>every piece of gaming software that releases nowadays is malware

Yes.


A kernel driver can patch anything on your system. It's not the same as an auto-updating game.


I'm not talking about games. I was referring to the typical gaming software in form of GPU or input hardware drivers. Nvidia for example bundles a shitton of telemetry services with their drivers, as does logitech with their keyboard drivers.


Looks like this is a booter and it hooks into network traffic https://twitter.com/gossithedog/status/1407328596247646217?s...


> Notably, the C2 IP 110.42.4.180 that the malicious Netfilter driver connects to belonged to Ningbo Zhuo Zhi Innovation Network Technology Co., Ltd, according to WHOIS records.

> The U.S. Department of Defense (DoD) has previously marked this organization as a "Communist Chinese military company," another researcher @cowonaut observed.

Tried to check the claim, which links to http://www.defense.gov/Newsroom/Releases/Release/Article/247.... There are four short PDFs linked there, and I couldn't find the company in any of the lists. Tried to check historical versions on Internet Archive, still couldn't find anything, but admittedly I only looked briefly. "site:defense.gov Ningbo Zhuo Zhi" turned up nothing on Google.

Now, WHOIS record points to nbgaofang.com, which claims to be a cloud provider specializing in DDoS protection, so a Cloudflare of sort.

Do reporters actually read what they link? Do they intentionally report “juicy” bs with sham links that don’t support the bs, knowing full well that few readers would try to verify sources?


These are the US trademarks for "Ningbo Zhuozhi Innovation Network Technology Co.,Ltd." (one less space):

* https://uspto.report/company/Ningbo-Zhuozhi-Innovation-Netwo...

There may be something lost in transliteration, or the DoD may have listed the parent (holding) company.


Sure, there’s nothing for one less space either, or anything close.

> or the DoD may have listed the parent (holding) company

Like which one? I think it’s the reporter’s job to find out. Posting a link implying something’s there when there’s actually nothing is garbage reporting at best and malicious at worst.

Here's a more likely explanation though. The "security researcher" quoted saw AS56041 China Mobile communications corporation in WHOIS, which is actually listed by DoD, rather than the aforementioned company. However, that's just the backbone provider; attributing based on that is kind of like attributing a U.S. C2 server to Hurricane Electric. The "security researcher" should probably stay away from the attribution game if they can't recognize backbone providers.


> Like which one? I think it’s the reporter’s job to find out.

I don't disagree. If you feel strongly about it:

* https://www.bleepingcomputer.com/author/ax-sharma/


This article is certainly low quality, it rehashes existing media stereotypes (something bad happened, and naughty China is behind it) instead of making things clear. Obvious questions (What game or other software bundled the driver? Was it made by the same company that signed the code? What runs on the servers from the redirected addresses list?) are not even mentioned. If the narrative is that Chinese hackers tricked the impregnable systems of Microsoft, then it falls apart after learning, say, how many times official drivers from Intel and the likes were found to have privilege escalation functionality. Also, we are supposed to be happy when it's official western corporate rootkits that collect all system information they want, and share it with who knows which organizations, or when Microsoft itself does it.



