Hacker News | nwallin's comments

They're called NPUs, and all recent CPUs from Intel, AMD, or Apple have them. They're actually reasonably power efficient. All flagship smartphones have them, as do several models down the line.

IIRC Linux drivers are pretty far behind, because no one who works on Linux stuff is particularly interested in running personal info like screenshots or mic captures through a model and uploading the telemetry. While in general I get annoyed when my drivers suck, in this particular case I don't care.


In Southern California it costs $120 just for a guy to come out and look at your HVAC. Not fix anything--not install anything--just to look at it and give you an estimate for how much the repair is going to cost. I went to the website for a local installer and they give a ballpark of $13,000-$25,000 for a heat pump installation.

I don't know why it's so expensive here. It shouldn't be, it makes no sense. But it is.


No, he meant 20 kmAh. It's rated for a 5 km hike, delivering 2 amps the whole time, taking 2 hours. 5*2*2 = 20.

(I wish batteries were just rated in Joules)


> The only valid battery capacity unit of measure is watt hours.

I would also accept Joules. But yes, the unit should be a unit of energy.


You are probably making a joke, but just in case, and for those who don't know: a joule _is_ a watt hour, just scaled down.

1 joule is 1 watt-second, to be precise, so 1 Wh is 3600 joules.


So that's why it doesn't align with existing industry units and always requires conversions...


Thank goodness SI units are power-of-ten based so converting between watt hours and joules is just a matter of moving the decimal place. Oh, and throwing in an ancient Sumerian constant approximating the number of Earth rotations as it revolves around the sun once.


No: 1 watt hour = 3.6 kJ, so going from J to Wh means moving the decimal point a couple of places AND dividing by 3.6. The actual unit used in circuit design is mAh, so the decimal point has to be moved one more time and then divided by 3.7 [V] (a typical Li-ion nominal voltage) again. That's too much for a smooth-brained man like me.
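(A worked example with made-up numbers: a 37 kJ pack is 37,000 / 3,600 ≈ 10.3 Wh, and at a nominal 3.7 V that's 10.3 / 3.7 ≈ 2.78 Ah, i.e. about 2,780 mAh.)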


Do seconds have anything to do with the Earth rotating around the sun? I thought a second just has to do with the Earth’s rotation on its axis.

Also, I wonder how usable a unit of time that was not based on a day would be, since so much of our life revolves around that cycle.


Yes, seconds are related to the Earth rotating around the sun. Simplifying slightly, the normal definition of a day is how long it takes the Earth to rotate on its axis until the same spot on the Earth points at the sun again.

Compare the mean solar day vs. the stellar day vs. the sidereal day - the difference is about 4 minutes.


Thanks, I see now from:

https://en.wikipedia.org/wiki/Solar_time

I was thinking along the lines of the ancient Sumerians arbitrarily deciding to divide 1 day into 24 hours, each hour into 60 minutes, and each minute into 60 seconds, and how that doesn't have anything to do with how humans came up with the concept of 1 year (the Earth going around the sun once).


I prefer to use British Thermal Units (BTU) for my battery capacity.


I use calories… then I can plan my meals and electronics together.


I know it's a matter of taste but personally I prefer to use the Calorie instead of the calorie ;)

(Context: 1 Calorie = 1 kilocalorie = 1000 (gram) calories; https://en.wikipedia.org/wiki/Calorie)


> 1000 (gram) calories

The "(gram)" make no sense here. We commonly use "kilo" as shorthand for kilogram", but kilo is just a prefix indicating 1000 and never indicates "kilogram" when given as a prefix to another unit, and so there's no implied/left out "gram" in 1 kilocalorie.


1 kilogram Calorie is the amount of energy needed to raise the temperature of one kilogram of water by 1 °C. 1 gram calorie is the amount of energy needed to raise the temperature of 1 gram of water by 1 °C.


I see there's some use of it after having done some searches, so I'll concede it makes some minor sense as a way to disambiguate the Calorie/calorie confusion, especially as "calorie" and "gram calorie" then mean the same thing. This is the first time I can recall having seen anyone use it, though, so for me at least it confused matters rather than clearing them up...


Today I learned there are multiple BTU definitions. https://en.wikipedia.org/wiki/British_thermal_unit lists Thermochemical, 59 °F, 60 °F, 39 °F, and International Steam Table.

Though as the difference is at most 0.5%, it probably won't affect your battery-buying experience. :)

Measuring by TNT equivalent is more standardized. "This battery stores 50 grams of TNT."

Ummm, on second thought, maybe don't use that term at the airport, .. or in secure areas, ... or near the police, ... or in public, ... or on social media or anything else tapped by the NSA or other authorities.


We could also talk about lb•AU (pound Astronomical Units), but generally it's best to stick to what's standard so readers don't need to do conversions. Watt hours is great.


It's not terrible... The iPhone 17 has a battery capacity of 63 nano lb·AU. Around 16 million of those batteries would equal 1 lb·AU.

Another fun one would be the milli hundredweight league (mcwt·lg). Both the hundredweight and the league have multiple accepted definitions, to make it more "fun". But the range maps quite nicely to everyday things (sanity check below):

AA battery - Around 5 mcwt·lg

Phone battery - Around 20 mcwt·lg

Laptop battery - Around 200 mcwt·lg

Car 12v battery - Around 1,000 mcwt·lg

EV battery - Around 100,000 mcwt·lg
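(A sanity check with one pair of the accepted definitions, the long hundredweight (~50.8 kg) and the 3-mile league (~4,828 m), treating the unit as hundredweight-force times league: 1 cwt·lg ≈ 50.8 kg × 9.81 m/s² × 4,828 m ≈ 2.4 MJ, so 20 mcwt·lg ≈ 48 kJ ≈ 13 Wh, which is indeed phone-battery territory.)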


One of the demos was printing something out, but the processor was hopelessly slow at performing the actual print job. So they hand-unrolled all the code to get it down from something like a 30-minute print job to a 30-second print job.

I think at this point it should be expected that every publicly facing demo (and most internal ones) are staged.


I've thankfully never had my house robbed, or a cell phone or laptop stolen. I have had my car broken into. The thieves chucked a paving stone through the window, grabbed a backpack sitting on the passenger's seat, and ran off with it. Left the paving stone in the driver's seat. The backpack had my gym clothes in it. A T-shirt I was rather fond of, a pair of shorts, a few extra pairs of socks, and a shitty pair of sneakers, all were well worn.

Replacing the backpack and gym clothes was probably $100, market value was maybe $10, and it was $507 to fix the window. (my deductible was $500.)


I thought you were going to say "but they ignored the $100 textbook on the dashboard" or something. The anecdote doesn't demonstrate anything. How much of an inconvenience the theft was for you is not a factor for the thief. They got $10 by chucking a rock through a window, and they only lost the opportunity cost of choosing a different victim.


They had to take the cumulative risk of getting caught though - one well-targeted burglary to take a designer handbag or diamond necklace would earn that thief as much as the indiscriminate 'stealing nwallin's gym clothes' thief would make in a year, as long as they had the network to sell the contraband on without incriminating themselves.


That risk is there regardless of what they steal. The kind of thieves who break into cars are low-effort-random-reward. They have neither the patience nor the skill nor the resources for the kind of planning you're referring to. Yes, the bag didn't contain much of value. A different bag might have. Had the thief known that for a fact beforehand they probably wouldn't have bothered.

Outside nwallin's car: no valuables

Inside nwallin's car: maybe valuables?


There is no risk in many states, like California:


I had my apartment broken into at one point many years ago, and the thief basically only bothered to take my MacBook Air. Nothing else was missing.


> With a copyright, people are allowed to do anything similar to you, so long as they do not derive their work from yours.

John C. Fogerty famously got sued for sounding too similar to John C. Fogerty.

https://blogs.law.gwu.edu/mcir/case/fantasy-v-fogerty/


> The universe [...] doesn't require infinite precision,

Doesn't it though?

What happens when three bodies in a gravitationally bound system orbit each other? Our computers can't precisely compute their interaction because our computers have limited precision and discrete timesteps. Even when we discard such complicated things as relativity, what with its Lorentz factors and whatnot.

Nature can perfectly compute their interactions because it has smooth time and infinite precision.


> Nature can perfectly compute their interactions because it has smooth time and infinite precision

That doesn't follow. Nature can perfectly compute them because they are nature. Nowhere is it required to have infinite precision, spatial or temporal.


Accept-reject methods are nonstarters when the architecture makes branching excessively expensive, specifically SIMD and GPU, which is one of the domains where generating random points on a sphere is particularly useful.

The Box-Muller transform is slow because it requires log, sqrt, sin, and cos. Depending on your needs, you can approximate all of these.

log2 can be easily approximated using fast inverse square root tricks:

    #include <bit>      // std::bit_cast (C++20)
    #include <cstdint>

    constexpr float fast_approx_log2(float x) {
      // For positive x, the IEEE 754 bits read as an integer are
      // approximately 2^23 * (log2(x) + 127).
      float y = static_cast<float>(std::bit_cast<std::int32_t>(x));
      constexpr float a = 1.0f / (1 << 23);
      y *= a;        // now roughly log2(x) + 127
      y -= 127.0f;   // remove the exponent bias
      return y;
    }
(conveniently, this also removes the need to ensure your input is not zero)

sqrt is pretty fast; turn `-ffast-math` on. (this is already the default on GPUs) (remember that you're normalizing the resultant vector, so add this to the mag_sqr before square rooting it)

The slow part of sin/cos is precise range reduction. We don't need that. The input to sin/cos in Box-Muller is by construction in the range [0, 2pi]. Range reduction is a no-op.

For my particular niche, these approximations and the resulting biases are justified. YMMV. When I last looked at it, the fast log2 gave a bunch of linearities where you wanted it to be smooth, but across multiple dimensions these linearities seemed to cancel out.
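To make that concrete, here's a minimal sketch of a branch-free Box-Muller step built on the approximation above. It's my own illustration, not code from the project referred to here: `approx_gaussian_pair` is a made-up name, `u1`/`u2` are assumed to be uniform random floats in (0, 1], and the standard-library sqrt/sin/cos stand in for whatever fast variants your target provides.

    #include <cmath>
    #include <utility>

    // Box-Muller: r = sqrt(-2 ln u1), theta = 2*pi*u2.
    // ln(u1) = log2(u1) * ln(2), so the fast log2 above slots straight in.
    std::pair<float, float> approx_gaussian_pair(float u1, float u2) {
      constexpr float ln2    = 0.69314718f;
      constexpr float two_pi = 6.2831853f;
      float r     = std::sqrt(-2.0f * ln2 * fast_approx_log2(u1));
      float theta = two_pi * u2;   // already in [0, 2*pi]: no range reduction needed
      return { r * std::cos(theta), r * std::sin(theta) };
    }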


fastmath is absolutely not the default on any GPU compiler I have worked with (including the one I wrote).

If you want fast sqrt (or more generally, if you care at all about not getting garbage), I would recommend using an explicit approx sqrt function in your programming language rather than turning on fastmath.


I've read about the fast inverse square root trick, but it didn't occur to me that it can be used for other formulas or operations. Is this a common trick in DSP/GPU-like architectures nowadays?

And what's the mathematical basis? (that is, is this technique formalized anywhere?)

It seems insane to me that you run Newton's algorithm straight on the IEEE 754 format bits and it works, what with the exponent in excess coding and so on


1/sqrt(x) is complicated. Imagine that instead of computing 1/sqrt(x), you wanted to compute exp_2(-.5 log_2(x)). Also imagine you have an ultra fast way to compute exp_2 and log_2. If you have an ultra fast way to compute exp_2 and log_2, then exp_2(-.5 log_2(x)) is gonna be fast to compute.

It turns out you do have an ultra fast way to compute log_2: you bitcast a float to an integer, and then twiddle some bits. The first 8 bits (after the sign bit, which is obviously zero because we're assuming our input is positive) or whatever are the exponent, and the trailing 23 bits are a linear interpolation between 2^n and 2^(n+1) or whatever. exp_2 is the same but in reverse.
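Here's a minimal sketch of that reverse direction, mirroring the fast_approx_log2 snippet further up the thread. It's my own illustration (the name `fast_approx_exp2` is made up), and the same accuracy caveats apply:

    #include <bit>
    #include <cstdint>

    constexpr float fast_approx_exp2(float x) {
      x += 127.0f;           // re-apply the exponent bias
      x *= float(1 << 23);   // scale back up into the exponent's bit position
      return std::bit_cast<float>(static_cast<std::int32_t>(x));
    }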

You can simply convert the integer to floating point, multiply by -.5, then convert back to an integer. Alternatively, the multiply by -.5 can be applied directly to the bits while they're still an integer, but that's more complicated: you'll need to do some arithmetic, and some magic numbers.

So you're bitcasting to an integer, twiddling some bits, twiddling some bits, twiddling some bits, twiddling some bits, and bitcasting to a float. It turns out that all the bit twiddling simplifies if you do all the legwork, but that's beyond the scope of this post.

So there you go. You've computed exp_2(-.5 log_2 x). You're done. Now you need to figure out how to apply that knowledge to the inverse square root.

It just so happens that 1/sqrt(x) and exp(-.5 log x) are the same function. exp(-.5 log x) = exp(log(x^-.5)) = x^-.5 = 1/sqrt(x).

Any function where the hard parts are computing log_2 or exp_2 can be accelerated this way. For instance, x^y is just exp_2(y log_2 x).
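(Using the two sketches above, that would be roughly `fast_approx_exp2(y * fast_approx_log2(x))`, with all the same accuracy caveats.)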

Note that in fast inverse square root, you're not doing Newton's method on the integer part, you're doing it on the floating point part. Newton's method doesn't need to be done at all, it just makes the final result more accurate.

Here's a blog here that gets into the nitty gritty of how and why it works, and a formula to compute the magic numbers: https://h14s.p5r.org/2012/09/0x5f3759df.html
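For reference, here's the classic routine the post linked above dissects, rewritten with std::bit_cast instead of the original pointer punning; the magic constant and the single Newton step are the standard published ones:

    #include <bit>
    #include <cstdint>

    float fast_rsqrt(float x) {
      std::uint32_t i = std::bit_cast<std::uint32_t>(x);
      i = 0x5f3759df - (i >> 1);           // approximates exp2(-0.5 * log2(x)) on the bits
      float y = std::bit_cast<float>(i);
      y = y * (1.5f - 0.5f * x * y * y);   // one Newton-Raphson step refines the guess
      return y;
    }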


You can use the same techniques as fast inverse sqrt anywhere logs are useful. It's not particularly common these days because it's slower than a dedicated instruction and there are few situations where the application is both bottlenecked by numerical code and is willing to tolerate the accuracy issues. A pretty good writeup on how fast inverse sqrt works was done here: https://github.com/francisrstokes/githublog/blob/main/2024%2...

A lot of old-school algorithms like CORDIC went the same way.

There's a related technique to compute exponentials with FMA that's somewhat more useful in ML (e.g. softmax), but it has similar precision issues and activation functions are so fast relative to matmul that it's not usually worth it.


The range reduction of sin/cos is slow only because of the stupid leftover from the 19th century that is the measuring of the phases and plane angles in radians.

A much better unit for phase and plane angle is the cycle, where range reduction becomes exact and very fast.
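A minimal sketch of what that looks like (my own illustration, using the standard library rather than a hypothetical sine-of-cycles routine): range reduction in cycles is just taking the fractional part, and the final multiply by 2*pi is exactly the scaling that the rest of this comment argues a proper sin(2*pi*x) library interface should absorb for you.

    #include <cmath>

    float sin_cycles(float turns) {
      // The fractional part is the whole range reduction; for
      // non-negative inputs this subtraction is exact.
      float frac = turns - std::floor(turns);   // in [0, 1)
      return std::sin(6.2831853f * frac);       // scale by 2*pi only at the very end
    }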

The radian had advantages for symbolic computation done with pen and paper, by omitting a constant multiplicative factor in the derivation and integration formulae.

However, even for numeric computations done with pen and paper, the radian was inconvenient so in the 19th century the sexagesimal degrees and the cycles continued to be used in parallel with the radians.

Since the development of automatic computers there has remained no reason whatsoever to use the radian for anything. Radians are never used in input sensors or output actuators, because that can be done only with low accuracy, but in physical inputs and outputs angles are always measured in fractions of a cycle.

Computing derivatives or integrals happens much more seldom than other trigonometric function evaluations. Moreover, the functions that are derived or integrated almost always have another multiplicative factor in their argument, e.g. frequency or wavenumber, so that factor will also appear in the derivation/integration formula and the extra multiplication with 2Pi that appears in the derivative (or with 1/2Pi that appears in the integral) can usually be absorbed in that factor, or in any case it can be done only once for a great number of function evaluations.

Therefore switching the phase unit of measurement from radian to cycle greatly reduces the amount of required computation, while also increasing the accuracy of the results.

A mathematical library should implement only the trigonometric functions of 2Pi*x, and their inverses, which suffice for all applications. There is no need for the cos and sin functions with arguments in radians, which are provided by all standard libraries now.

For reasons that are completely mysterious to me, the C standard and the IEEE standard for FP arithmetic have been updated to include the trigonometric functions of Pi*x, not the functions of 2Pi*x.

It is completely beyond my power of comprehension why this has happened. All the existing applications want to measure phases in cycles, not in half-cycles. At most there are some applications where measuring the phases or plane angles in right angles could be more convenient, i.e. those might like trigonometric functions of (Pi/2)*x, but even in those cases there is no use for half-cycles.

So with the functions of the updated math libraries, e.g. cospi and sinpi, you hopefully can avoid the slow range reduction, but you still have to add a superfluous scaling by 2, due to the bad function definitions.

Similar considerations apply to the use of binary logarithms and exponentials instead of the slower and less accurate hyperbolic (a.k.a. natural) logarithms and exponentials.


Another datapoint that supports your argument is the Grand Theft Auto Online (GTAO) thing a few months ago.[0] GTAO took 5-15 minutes to start up. Like you click the icon and 5-15 minutes later you're in the main menu. Everyone was complaining about it for years. Years. Eventually some enterprising hacker disassembled the binary and profiled it. 95% of the runtime was in `strlen()` calls. Not only was that where all the time was spent, but it was all spent `strlen()`ing the exact same ~10MB resource string. They knew exactly how large the string was because they allocated memory for it, and then read the file off the disk into that memory. Then they were tokenizing it in a loop. But their tokenization routine didn't track how big the string was, or where the end of it was, so for each token it popped off the beginning, it had to `strlen()` the entire resource file.
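As a toy illustration of the pattern being described (not the actual GTAO code), a tokenizer that re-measures the remaining buffer on every call makes parsing N tokens out of an M-byte string cost O(N*M):

    #include <cstddef>
    #include <cstring>

    // Pops one comma-separated token off the front of s and returns a
    // pointer to the start of the next one. The strlen() re-walks the
    // entire remaining buffer on every single call.
    const char* next_token_slow(const char* s) {
      std::size_t remaining = std::strlen(s);
      std::size_t i = 0;
      while (i < remaining && s[i] != ',') ++i;
      return (i < remaining) ? s + i + 1 : s + remaining;
    }

Measuring the buffer once up front (its length is already known from the allocation) turns the whole parse into a single pass.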

The enterprising hacker then wrote a simple binary patch that reduced the startup time from 5-10 minutes to like 15 seconds or something.

To me that's profound. It implies that not only was management not concerned about the start up time, but none of the developers of the project ever used a profiler. You could just glance at a flamegraph of it, see that it was a single enormous plateau of a function that should honestly be pretty fast, and anyone with an ounce of curiosity would be like, ".........wait a minute, that's weird." And then the bug would be fixed in less time than it would take to convince management that it was worth prioritizing.

It disturbs me to think that this is the kind of world we live in. Where people lack such basic curiosity. The problem wasn't that optimization was hard (optimization can be extremely hard); it was just that nobody gave a shit and nobody was even remotely curious about bad performance. They just accepted bad performance as if that's just the way the world is.

[0] Oh god it was 4 years ago: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...


I just started getting back into gaming and I'm seeing shit like this all the time. It's amazing that stuff like this is so common while the Quake fast inverse square root algo is so well known.

How is it that these companies spend millions of dollars to develop games, and yet modders are making patches in a few hours fixing bugs that never get merged? Not some indie game, but AAA games!

I think you're right, it's on both management and the programmers. Management only knows how to rush, but not what to rush. The programmers fall for the trap (afraid to push back) and never pull up a profiler. Maybe they're overworked and overstressed, but those problems never get solved if no one speaks up and everyone stays quiet and buys into the rush-for-rushing's-sake mentality.

It's amazing how many problems could be avoided by pulling up a profiler or analysis tool (like Valgrind).

It's amazing how many millions of dollars are lost because no one ever used a profiler or analysis tool.

I'll never understand how their love for money makes them waste so much of it.


AAA games are, largely, quite bad in quality these days. Unfortunately, the desire to make a quality product (from the people who actually make the games) is overruled by the desire to maximize profit (from the people who pay their salaries). Indie games are still great, but I barely even bother to glance at AAA stuff any more.


  > by the desire to
An appropriate choice of words.

I'm just wondering if/when anyone will realize that often desire gets in the way of achieving. ̶T̶h̶e̶y̶ ̶m̶a̶y̶ ̶b̶e̶ ̶p̶e̶n̶n̶y̶ ̶w̶i̶s̶e̶ ̶b̶u̶t̶ ̶t̶h̶e̶y̶'̶r̶e̶ ̶p̶o̶u̶n̶d̶ ̶f̶o̶o̶l̶i̶s̶h̶.̶ Chasing pennies with dollars


That has been like that since there have been publishers in the games industry.

Back then, indie stuff only happened if you lived near someone you knew doing bedroom coding, distributing tapes at school, or if they got lucky and landed their game on one of those shareware tape collections.

Trying to actually get a publisher deal was really painful, and if you did, they really wanted their money back in sales.


Shareware tape collections? Was there really such a thing? If so, I would imagine it would be one or two demos per tape?


Yes, there was such a thing, for those of us that lived through the 1980s.

There are tons of games that you can fit onto 60-, 90-, or 180-minute tapes when 48 KB/128 KB is all you've got.

More like 20 or something.

Magazines like Your Sinclair and Crash would come with such cassette tapes:

https://archive.org/details/YourSinclair37Jan89/YourSinclair...

https://www.crashonline.org.uk/

They would be glued into the magazine with adhesive tape, and later on, to avoid them being stolen, the whole magazine plus tape would be wrapped in plastic.


> To me that's profound. It implies that not only was management not concerned about the start up time, but none of the developers of the project ever used a profiler.

Odds are that someone did notice it during profiling and filed a ticket with the relevant team to have it fixed, which was then set to low priority because implementing the latest batch of microtransactions was more important.

I feel like this is just a natural consequence of the metrics-driven development that is so prevalent in large businesses nowadays. Management has the numbers showing them how much money they make every time they add a new microtransaction, but they don't have numbers showing them how much money they're losing due to people getting tired of waiting 15 minutes for the game to load, so the latter is simply not acknowledged as a problem.


IIRC this bug existed from release, but it didn't impact the game until years later, after a sizable number of DLCs had been added to the online mode, since the function only got slower with each one added. Not that it's fine that the bug stayed in that long, but you can see how it would be missed: when they had actual programmers running profilers at development time, it wouldn't have raised any red flags after completing in ten seconds or whatever.


I don't know. As a developer, there would be even more reason to be curious as to why the release binary is an order of magnitude slower than what is seen in development.


At release it was "working fine, same as in dev".

It slowed down gradually as the JSON manifest of optional content grew.


> It disturbs me to think that this is the kind of world we live in. Where people lack such basic curiosity. The problem wasn't that optimization was hard (optimization can be extremely hard); it was just that nobody gave a shit and nobody was even remotely curious about bad performance. They just accepted bad performance as if that's just the way the world is.

The problem is, you don't get rewarded for curiosity, for digging down into problem heaps, or for straying out of line. To the contrary, you'll often enough get punished for not fulfilling your quota.


> and anyone with an ounce of curiosity would be like, ".........wait a minute"

I see what you did there ;)


> Another datapoint that supports your argument is the Grand Theft Auto Online (GTAO) thing a few months ago.[0] GTAO took 5-15 minutes to start up. Like you click the icon and 5-15 minutes later you're in the main menu. Everyone was complaining about it for years.

I see this is a datapoint, but not for your argument. This thing sat in the code base, didn't cause problems, and didn't affect sales of the game pre- or post-GTAO launch.

This sounds a lot like selection bias. You want to enhance the airplanes that flew and returned, rather than those that didn't come back.

Let's say they did the opposite and focused on improving this over a feature or a level from GTA. What level or what feature that you liked could you remove to make way for investigating and fixing this issue? Because at the end of the day - time is zero-sum. Everything you do comes at the expense of everything you didn't.


This is the sort of thing that, if fixed early enough in the development cycle, actually net brings forwards development. Because every time someone needs to test the game they hit the delay.

(which makes it all the more strange that it wasn't fixed)


> This is the sort of thing that, if fixed early enough in the development cycle

Is it? It didn't become noticeable until GTA got a bunch of DLCs.

Sure someone might have spotted it. But it would take more time to spot it early, and that time is time not spent fixing bugs.


I think you have the logic backwards. You are saying it didn't cause problems, right? Well that's the selection bias. You're basing your assumption on what is more easily measurable. It's "not a problem" because it got sales, right? Those are the planes that returned.

But what's much harder to measure is the number of sales you missed. Or where the downed planes were hit. You don't have the downed planes, you can't see where they were hit! You just can't have that measurement, you can only infer the data through the survivors.

  > Because at the end of the day - time is zero-sum
Time is a weird thing. It definitely isn't zero sum. There's an old saying from tradesmen "why is there always time to do things twice but never time to do things right?" Time is made. Sometimes spending less time gives you more time. And all sorts of other weird things. But don't make the classic mistake of rushing needlessly.

Time is only one part of the equation, and just like the body, the mind has stamina. Any physical trainer would tell you you're going to get hurt if you just keep working one muscle group, lifting just below your limit. It's silly that the expectation is that we go from sprint to sprint. The game industry is well known to be abusive of its developers, and that's already considering that the baseline for developers isn't a great place to start from, even if normalized.


> But what's much harder to measure is the number of sales you missed. Or where the downed planes were hit. You don't have the downed planes, you can't see where they were hit! You just can't have that measurement, you can only infer the data through the survivors.

Not really. There are about 300 million gamers [1] if you exclude Androids and iPhones. How many units did GTA V sell? 215 million [2]. It's a meteoric hit. They missed a sliver (35%) of their target audience.

You could argue that they missed the mobile market. But the biggest market, Android, is a pain to develop for; the minimum spec for GTA V to have parity on phones would (most likely) exclude a large part of the market, and the game itself isn't really mobile-friendly.

Ok, but we have a counterexample (pun intended): Counter-Strike. Similarly multiplayer, mostly targets PCs, developed by Valve, and similarly excellent and popular to boot. However, it's way faster and way better optimized. So how much did it "sell", according to [3]? 70 million. 86 million if you count Half-Life 1 and 2 as its single-player campaign.

I'm not sure what the deciding factor for people is, but I can say it's not performance.

> Time is a weird thing. It definitely isn't zero sum.

If you are doing thing X, you can't do another thing Y, unless you are multitasking (if you are a time traveler, beware of paradoxes). But then you are doing two things poorly, and even then, if you do X and Y, adding other tasks becomes next to impossible.

It definitely is. Tim Cain had a video[4] about how they spent man-months trying to find the cause of a weird foot-sliding bug that's barely noticeable, which they managed to solve. And at that time Diablo came out, and it was a massive success with foot sliding up the wazoo. So, just because it bugs you doesn't mean others will notice.

> "why is there always time to do things twice but never time to do things right?"

Because you're always operating with some false assumption. You can't do it right, because "right" isn't fixed and isn't always known, nor is it specified: right for whom?

[1]https://www.pocketgamer.biz/92-of-gamers-are-exclusively-usi...

[2]https://web.archive.org/web/20250516021052/https://venturebe...

[3]https://vgsales.fandom.com/wiki/Counter-Strike

[4]https://youtu.be/gKEIE47vN9Y?t=651


  > They missed a sliver (35%) of their target audience.
Next time you're at a party go take a third of the cake and then tell everyone you just took "a sliver". See what happens...

Honestly, there's no point in trying to argue with you. Either you're trolling, you're greatly disconnected from reality, or you think I'm brain dead. No good can come from a conversation with someone that is so incorrigible.


> Next time you're at a party go take a third of the cake and then tell everyone you just took "a sliver". See what happens...

Fine, I'll concede that was the wrong word. But:

> Honestly, there's no point in trying to argue with you. Either you're trolling, you're greatly disconnected from reality

Wait. I'm disconnected? Selling millions of units (Half-Life) is an amazing success, and selling tens of millions is a stellar success by any measure (Baldur's Gate, Call of Duty, Skyrim). But selling hundreds of millions (Minecraft, GTA V)? That's a top-10 most popular game of all time.

So according to you, one of the top 5 best-selling games in history is somehow missing a huge part of the market? You can argue a plethora of things, but you can't back up the speculation that GTA V could have done much better just by saying "you're trolling" / "no point arguing".

And saying that optimizing the DLC JSON loader could have given them a bigger slice of the pie is far-fetched at best.

You're extrapolating your preferences to 6 billion people. It's like watching a designer assume everyone will notice they used soft kerning, with dark grey font color on a fishbone paper background for their website. And that they cleverly aligned the watermark with the menu elements.


Honestly the GTA5 downloader/updater itself has pretty bad configuration. I wrote a post about it on Reddit years ago along with how to fix it.

I don't know if it's still applicable or not because I haven't played it for ages, but just in case it is, here's the post: https://www.reddit.com/r/GTAV/comments/3ysv1d/pc_slow_rsc_au...

