Even though they are all marketed as gaming cards, Nvidia is now very clearly differentiating between 5070/5070 Ti/5080 for mid-high end gaming and 5090 for consumer/entry-level AI. The gap between xx80 and xx90 is going to be too wide for regular gamers to cross this generation.


The 4090 already seemed positioned as a card for consumer AI enthusiast workloads. But this $1000 price gap between the 5080 and 5090 seems to finally cement that. Though we're probably still going to see tons of tech YouTubers making videos specifically about how the 5090 isn't a good value for gaming as if it even matters. The people who want to spend $2000 on a GPU for gaming don't care about the value and everyone else already could see it wasn't worth it.


From all the communication I've had with Nvidia, the prevailing sentiment was that the 4090 was an 8K card that happened to be good for AI due to the VRAM requirements of 8K gaming.

However, I'm a AAA gamedev CTO, so they may just have been telling me what the card means for someone like me.


Well, modern games + modern cards can't even do 4k at high fps with no DLSS. The 8k story is a total fairy tale. Maybe a "render at 540p, display at 8k" kind of thing?

P.S. Also, VR. For VR you need 2x 4k at a stable 90+ fps. There are (almost) no VR games, though.


> modern games + modern cards can't even do 4k at high fps

What "modern games" and "modern cards" are you specifically talking about here? There are plenty of AAA games released last years that you can do 4K at 60fps with a RTX 3090 for example.


> There are plenty of AAA games released in the last few years that you can run at 4K at 60fps with an RTX 3090, for example.

Not when you turn on ray tracing.

Also 60fps is pretty low, certainly isn't "high fps" anyway


This.

You can't get high frame rates with path tracing and 4K. It just doesn't happen. You need to enable DLSS and frame gen to get 100fps with more complete ray and path tracing implementations.

People might be getting upset because the 4090 has WAY more power than games need, but there are games that try to make use of that power and are actually limited by the 4090.

Case in point: Cyberpunk and Indiana Jones with path tracing don't get anywhere near 100FPS at native resolution.

Now many might say that's just a ridiculous ask, but that's what GP was talking about here. There's no way you'd get more than 10-15fps (if that) with path tracing at 8K.


> Case in point: Cyberpunk and Indiana Jones with path tracing don't get anywhere near 100FPS at native resolution.

For anyone unfamiliar with how demanding this is: Cyberpunk at native 4k with path tracing gets sub-20fps on a 4090. Nvidia's own 5090 announcement video showcased this as getting a whopping... 28 fps: https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2Ff...


> Also 60fps is pretty low, certainly isn't "high fps" anyway

I’m sure some will disagree with this but most PC gamers I talk to want to be at 90FPS minimum. I’d assume if you’re spending $1600+ on a GPU you’re pretty particular about your experience.


I’m so glad I grew up in the n64/xbox era. You save so much money if you are happy at 30fps. And the games look really nice.


You can also save tons of money by combining used GPUs from two generations ago with a patientgamer lifestyle, without needing to suffer through 30fps.


I wish more games had an option for N64/Xbox-level graphics to maximize frame rate. No eye candy tastes as good as 120Hz feels.


I’m sure you could do N64 style graphics at 120Hz on an iGPU with modern hardware, hahaha. I wonder if that would be a good option for competitive shooters.

I don’t really mind low frame rates, but latency is often noticeable and annoying. I often wonder if high frame rates are papering over some latency problems in modern engines. Buffering frames or something like that.


Doom 2016 at 1080p with a 50% resolution scale (so, really, 540p) can hit 120 FPS on an AMD 8840U. That's what I've been doing on my GPD Win Mini, except that I usually cut the TDP down to 11-13W, where it's hitting more like 90-100 FPS. It looks and feels great!


Personally I've yet to see a ray tracing implementation that I would sacrifice 10% of my framerate for, let alone 30%+. Most of the time, to my tastes, it doesn't even look better, it just looks different.


> Also 60fps is pretty low, certainly isn't "high fps" anyway

Uhhhhhmmmmmm....what are you smoking?

Almost no one is playing competitive shooters and such at 4k. For those games you play at 1080p and turn off lots of eye candy so you can get super high frame rates because that does actually give you an edge.

People playing at 4k are doing immersive story driven games and consistent 60fps is perfectly fine for that, you don't really get a huge benefit going higher.

People that want to split the difference are going 1440p.


Anyone playing games would benefit from higher frame rate no matter their case. Of course it's most critical for competitive gamers, but someone playing a story driven FPS at 4k would still benefit a lot from framerates higher than 60.

For me, I'd rather play a story based shooter at 1440p @ 144Hz than 4k @ 60Hz.


You seem to be assuming that the only two buckets are "story-driven single player" and "PvP multiplayer", but online co-op is also pretty big these days. FWIW I play online co-op shooters at 4K 60fps myself, but I can see why people might prefer higher frame rates.


Games other than esports shooters and slow paced story games exist, you know. In fact, most games are in this category you completely ignored for some reason.

Also nobody is buying a 4090/5090 for a "fine" experience. Yes 60fps is fine. But better than that is expected/desired at this price point.


This: the latest Call of Duty game on my (albeit water-cooled) 3080 Ti Founders Edition saw frame rates of 90-100fps running natively at 4k (no DLSS).


Can't CoD do 60+ fps @1080p on a potato nowadays?... not exactly a good reference point.


4k90 is about 6 times that (4x the pixels at 1.5x the frame rate), and he probably has the options turned up.

I’d say the comparison is what’s faulty, not the example.


The new CoD is really unoptimized. Still getting 100 fps at 4k on a few-years-old 3080 is pretty great. If he uses some frame gen such as Lossless Scaling he can get 120-150. Say what you will about Nvidia prices, but you do get years of great gaming out of them.


Honestly my water-cooled 3080 Ti FE has been great. I wish it had more VRAM for VR (DCS, MSFS), but otherwise I have no complaints.


Which block did you go with? I went with the EK Vector special edition, which has been great, but I'll need to look at something else if I upgrade to a 5080, given their recent woes.


I just have the Alphacool AIO with a second 360 radiator.

I’ve done tons of custom stuff but was at a point where I didn’t have the time for a custom loop. Just wanted plug and play.

Seen some people talking down the block, but honestly I run 50c under saturated load at 400 watts, +225 core, +600 memory with a hot spot of 60c and VRAM of 62c. Not amazing but it’s not holding the card back. That’s with the Phanteks T30’s at about 1200RPM.

Stock cooler I could never get the card stable despite new pads and paste. I was running 280 watts, barely able to run -50 on the core and no offset on memory. That would STILL hit 85c core, 95c hotspot and memory.


Yep. Few AAA games can hold 4K60 at max settings on a 4090 without upscaling or frame gen; most at least occasionally dip below 60. Also, most monitors sold with VRR (which I would argue is table stakes now) are >60FPS.


The 4080 struggles to play high-end games at 4k, and there aren't that many 8k TVs/monitors on the market... It doesn't make much sense that anyone would think of the 4090 as an 8k GPU, to be honest.


I recall them making the same claims about the 3090:

https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8...


Seems kinda silly to make an 8K video card when ... nobody on the planet has an 8K screen


Perhaps you don't, but several of us do. They've been around a while and are available at your local Best Buy/Costco. If you're fine rocking a 4:4:4 TV, they're not even particularly pricey, and they're great for computing (depending on the subpixel layout).

On the planet? Many people. Maybe you're thinking 12K or 16K.


It's been a few years since I worked at [big tech retailer], but 8K TVs basically didn't sell at the time. There was basically no native content (even the demos were upscaled 4K), and it was very hard to tell the difference between the two unless you were so close to the screen that you couldn't see the whole thing. For the content that was available, you were either dealing with heavy compression or setting up a high-capacity server, since a few movies would eat most of what people would consider a normal-sized hard drive.

The value just wasn't there and probably won't ever be for most use cases. XR equipment might be an exception, video editing another.


I got 4K TVs for both of my kids; they're dirt cheap, sub-$200. I'm surprised the Steam hardware survey doesn't show more. A lot of my friends also set their kids up on TVs, and you can hardly buy a 1080p TV anymore.


Does Steam hardware survey show the resolution of your usual desktop, or your gaming resolution? eg I run at 4k in Windows normally, but quite often run games at 1080p.


I'd bet it's either the native display resolution or whatever you had for your desktop when submitted. They're able to gather all kinds of hardware specs so I'd lean to the native resolution as the most likely answer.


2018 (6 years ago): https://www.techradar.com/reviews/dell-ultrasharp-up3218k

It's uncommon, sure, but as mentioned it was sold to me as being a development board for future resolutions.


> Seems kinda silly to make a 4K video card when ... nobody on the planet has a 4K screen.

Someone else probably said that years ago when everyone was rocking 1080/1440p screens.


If you look at the Steam hardware survey you’ll find the majority of gamers are still rocking 1080p/1440p displays.

What gamers look for is more framerate, not necessarily more resolution. Most new gaming monitors are focusing on high refresh rates.

8K feels like a waste of compute for a very diminished return compared to 4K. I think 8K only makes sense for huge displays, I'm talking beyond 83 inches, and we are still far from that.


Gaming aside, 4K is desirable even on <30" displays, and honestly I wouldn't mind a little bit more pixel density there to get it to true "retina" resolution. 6K might be a sweet spot?

Which would then imply that you don't need a display as big as 83" to see the benefits from 8K. Still, we're talking about very large panels here, of the kind that wouldn't even fit many computer desks, so yeah...


First consumer 4K monitors came out more than a decade ago. I think the Asus PQ321 in 2013. That’s close to where we are now with 8K.

How many of the cards of that time would you call “4K cards”? Even the Titan X that came a couple of years later doesn’t really cut it.

There’s such a thing as being too early to the game.


Gaming isn't the only use-case, but Steam hardware survey says ~4% of users are using 4k screens. So the market is still small.


Why does 8K gaming require more VRAM?

I'd think the textures and geometry would have the same resolution (or is that not the case? Even at 4K, if you walk closer to a wall you'd want higher texture resolution anyway, assuming the artists made the assets at that resolution in the first place).

An 8K framebuffer requires only about 132 megabytes of memory to store the pixels (at 32-bit color), which doesn't explain gigabytes of extra VRAM.

I'd be curious to know what information I'm missing


My understanding is that between double buffering and multiple sets of intermediate buffers for shaders, you usually have a bunch of screen-sized buffers hanging around in VRAM, though you are probably right that these aren't the biggest contributor to VRAM usage in the end.


Shadow maps are a good example: if the final rendered image is 4k, you don't want the shadow maps for each light source to be only 1080p, or your shadows will be chunkier.


You're only thinking of the final raster framebuffer; there are multiple raster and shader stages. Increasing the native output resolution multiplies the memory needed at every one of those stages (4x the pixels going from 4K to 8K).
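
To put rough numbers on that, here's a back-of-the-envelope sketch; the buffer list and per-pixel formats below are my own illustrative assumptions (a generic deferred-renderer-style layout), not any real engine's:

    # Hypothetical set of screen-sized render targets; counts/formats are guesses.
    RESOLUTIONS = {"4K": (3840, 2160), "8K": (7680, 4320)}
    BUFFER_BYTES_PER_PIXEL = {
        "swapchain x2 (RGBA8)": 2 * 4,
        "HDR color (RGBA16F)": 8,
        "depth/stencil (D32S8)": 5,
        "G-buffer normals (RGB10A2)": 4,
        "G-buffer albedo (RGBA8)": 4,
        "G-buffer material (RGBA8)": 4,
        "motion vectors (RG16F)": 4,
    }

    for label, (w, h) in RESOLUTIONS.items():
        pixels = w * h
        total_bytes = pixels * sum(BUFFER_BYTES_PER_PIXEL.values())
        print(f"{label}: screen-sized buffers alone ~ {total_bytes / 2**20:.0f} MiB")
    # prints roughly 293 MiB at 4K and 1171 MiB at 8K with these assumptions

So the screen-sized buffers go from hundreds of MB to over a GiB, and the rest of the gap is mostly the higher-resolution textures and geometry you'd want streamed in to actually look good at 8K.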


When you render a higher resolution natively, you typically also want higher resolution textures and more detailed model geometry.


I do recall an 8K push but I thought that was on the 3090 (and was conditional on DLSS doing the heavy lifting). I don't remember any general marketing about the 4090 being an 8K card but I could very well have missed it or be mixing things up! I mean it does make sense to market it for 8K since anyone who is trying to drive that many pixels when gaming probably has deep pockets.


I recall the 3090 8K marketing too. However, I also recall Nvidia talking about 8K in reference to the 4090:

https://www.nvidia.com/en-us/geforce/technologies/8k/

That said, I recall that the media was more enthusiastic about christening the 4090 as an 8K card than Nvidia was:

https://wccftech.com/rtx-4090-is-the-first-true-8k-gaming-gp...


If I recall correctly, the 3090, 3090 Ti and 4090 were supposed to replace the Titan cards, which had been Nvidia's top consumer cards but were never really meant for gaming.


Someone very clever at Nvidia realized that if they rename their professional card (Titan) to be part of their "gaming" line, you can convince adults with too much disposable income that they need it to play Elden Ring.

I didn't know of anyone who used the Titan cards (which were actually priced cheaper than their respective xx90 cards at release) for gaming, but somehow people were happy spending >$2000 when the 3090 came out.


As an adult with too much disposable income and a 3090, it just becomes a local LLM server w/ agents when I'm not playing games on it. Didn't even see the potential for it back then, but now I'm convinced that the xx90 series offers me value outside of just gaming uses.


>but somehow people were happy spending >$2000 when the 3090 came out

Of course they did. The 3090 came out at the height of the pandemic and crypto boom in 2020, when people were locked indoors with plenty of free time and money to spare. What else were they gonna spend it on?


I wonder if these will be region-locked (eg, not for HK SAR).


The only difference is scale. That isn't differentiation, that's segregation.

It won't stop crypto and LLM peeps from buying everything (one assumes TDP is proportional too). Gamers not being able to find an affordable option is still a problem.


>Gamers not being able to find an affordable option is still a problem.

Used to think about this often because I had a side hobby of building and selling computers for friends and coworkers that wanted to get into gaming, but otherwise had no use for a powerful computer.

For the longest time I could still put together $800-$1000 PC's that could blow consoles away and provide great value for the money.

Nowadays I almost want to recommend they go back to console gaming. Seeing older PS5s on store shelves hit $349.99 during the holidays really cemented that idea. It's so astronomically expensive to do a PC build at the moment unless you can be convinced to buy a gaming laptop on a deep sale.


One edge that PCs have is massive catalog.

Consoles have historically not done so well with backwards compatibility (at most one generation). I don't do much console gaming but _I think_ that is changing.

There is also something to be said about catalog portability via something like a Steam Deck.


Cheaper options like the Steam Deck are definitely a boon to the industry. Especially the idea of "good enough" gaming at lower resolutions on smaller screens.

Personally, I just don't like that it's attached to Steam, which is why I can be hesitant to suggest consoles as well now that they have soft-killed their physical game options, unless you go out of your way to get the add-on drive for PS5, etc.

It's been nice to see backwards compatibility coming back to some extent in modern consoles, with Xbox especially if you have a Series X with the disc drive.

I killed my Steam account with 300+ games just because I didn't see a future where Steam would actually let me own the games. I repurchased everything I could on GOG and gave up on games locked to the Windows/Mac app stores, Epic, and Steam. So I'm not exactly fond of hardware attached to that platform, but that doesn't stop someone from just loading it up with games from a service like GOG and running them through Steam or Heroic Launcher.

2024 took some massive leaps forward in getting a Proton-like experience without Steam, and that gives me a lot of hope for future progress on Linux gaming.


>Unless you go out of your way to get the add-on drive for PS5

Just out of interest, if I bought a PS5 with the drive, and a physical game, would that work forever (just for single-player games)?

Like you, I like to own the things I pay for, so it's a non-starter for me if it doesn't.


In my experience it varies on a game-by-game basis. Some games have limitations (e.g. Gran Turismo 7 only having Arcade mode available offline).


Are crypto use cases still there? I thought that went away after eth switched their proof model.


Bitcoin is still proof of work.


Yeah but BTC is not profitable on GPU I thought (needs ASIC farms)


Yup, it seems the days of the value high-end card are dead. I thought we would see a cut-down 4090 at some point last generation, but it never happened. Surely there's a market gap somewhere between the 5090 and 5080.


The xx90 cards are really Titan cards. The 3090 was the successor to the Titan RTX, while the 3080 Ti was the successor to the 2080 Ti, which succeeded the 1080 Ti. This succession continued into the 40 series and now the 50 series. If you consider the 2080 Ti to be the "value high end card" of its day, then it would follow that the 5080 is the value high end card today, not the 5090.


In all those historical cases the second-tier card was a cut-down version of the top-tier one. Now the 4080 and 5080 are a different chip, and there's a gulf of a performance gap between them and the top tier. That's the issue I am highlighting: the 5080 is half a 5090, whereas in the past a 3080 was only 10% off a 3090 performance-wise.


It was not actually. The last time this was the case was Maxwell:

https://www.techpowerup.com/gpu-specs/nvidia-gm200.g772

Beginning with Pascal, Nvidia’s top GPU was not available in consumer graphics cards:

https://www.techpowerup.com/gpu-specs/nvidia-gp100.g792

Turing was a bit weird: instead of having a TU100, they had Volta's GV100:

https://www.techpowerup.com/gpu-specs/nvidia-tu102.g813

https://www.techpowerup.com/gpu-specs/nvidia-gv100.g809

Then there is Ampere’s GA100 that never was used in a consumer graphics card:

https://www.techpowerup.com/gpu-specs/nvidia-ga100.g931

Ada was again weird: instead of an AD100, it had the GH100:

https://www.techpowerup.com/gpu-specs/nvidia-ad102.g1005

https://www.techpowerup.com/gpu-specs/nvidia-gh100.g1011

Now with Blackwell the GB100 is the high end one that is not going into consumer cards. The 5090 gets GB202 and the 5080 gets GB203.

Rather than the 40 series and 50 series putting the #2 GPU die into the #2 consumer card, they are putting the #3 GPU die into the #2 consumer card.


This is not relevant to what is being discussed. I clearly mean top tier consumer GPU.

3080/3090 - Same die

2080 ti/Titan RTX - Same die

1080 ti/Titan Xp - Same die

980 ti/Titan X - Same die

780/Titan - Same die

670/680 - Same die

570/580 - Same die

470/480 - Same die


Yes, but Nvidia thinks enough of them get pushed up to the 5090 to make the gap worthwhile.

Only way to fix this is for AMD to decide it likes money. I'm not holding my breath.


AMD announced they aren't making a top-tier card for the next generation and are focusing on mid-tier.

Next generation, they are finally reversing course and unifying their AI and GPU architectures (just like Nvidia).

2026 is the big year for AMD.


AMD's GPU marketing during CES has been such a shit show. No numbers, just adjectives and vibes. They're either hiding their hand, or they continue to have nothing to bring to the table.

Meanwhile, their CPU marketing has numbers and graphs, because they're at the top of their game and have nothing to hide.

I'm glad they exist because we need the competition, but the GPU market continues to look dreary. At least we have a low/mid range battle going on between the three companies to look forward to for people with sensible gaming budgets.


Don't necessarily count Intel out.


Intel is halting its construction of new factories and mulling over whether to break up the company...


Intel's Board is going full Kodak.


I wouldn't count Intel out in the long term, but it'll take quite a few generations for them to catch up and who knows what the market will be like by then


Intel hates making money even more than AMD does.


Intel's Arc B580 budget card is selling like hotcakes... https://www.pcworld.com/article/2553897/intel-arc-b580-revie...


They fired the CEO for daring to make a product such as this. The $25 million they paid to get rid of him might even wipe out their profits on this product.


Starting around 2000, Intel tried to make money via attempts at everything but making a better product (pushing RAMBUS RAM, Itanium, crippling low-end chips more than they needed to be, focusing on keeping chip manufacturing in-house and thereby losing out on economies of scale). The result was that engineers were (not always, but too often) nowhere near the forefront of technology. Now AMD, NVIDIA, and ARM are all chipping away (pun intended).

It's not dissimilar to what happened to Boeing. I'm a capitalist, but the current accounting laws (in particular corporate taxation rules) mean that all companies are pushed to spend money on stock buybacks rather than R&D (Intel spent more on the former than the latter over the past decade, and I'm watching Apple stagnate before my eyes).


You underestimate how many gamers got a 4090.


Nvidia is also clearly positioning the 5090 as the gaming card for people who want the best and for whom an extra thousand dollars is a rounding error. They could have sold it for $1500 and still made big coin, but no doubt the extra $500 is pure wealth tax.

It probably serves to make the 4070 look reasonably priced, even though it isn't.


Gaming enthusiasts didn't bat an eye at the 4090's price and won't bat one here either.

The 4090 was already priced for high-income people (in first-world countries). Nvidia saw 4090s being sold on the second-hand market for way beyond $2k. They're merely milking the cow.


Double the bandwidth, double the RAM, double the pins, and double the power isn't cheap. I wouldn't be surprised if the profit on the 4090 was less than on the 4080, especially since any R&D costs will be spread over significantly fewer units.


There have been numerous reports over the years that the 4090 actually outsold the 4080.


The 4080 was also quite the bad value compared to the much better 4090. That remains to be seen for the 5000 series.


The 4080 was designed as a strawman card expressly to drive sales towards the 4090. So this is by design.


Leaks indicate that the PCB has 14 layers with a 512-bit memory bus. It also has 32GB of GDDR7 memory and the die size is expected to be huge. This is all expensive. Would you prefer that they had not made the card and instead made a lesser card that was cheaper to make to avoid the higher price? That is the AMD strategy and they have lower prices.


That PCB is probably a few dollars per unit. The die is probably the same as the one in the 5070. I've no doubt it's an expensive product to build, but that doesn't mean the price is cost plus markup.


Currently, the 5070 is expected to use the GB205 die while the 5090 is expected to use the GB202 die:

https://www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218

https://www.techpowerup.com/gpu-specs/geforce-rtx-5090.c4216

It is unlikely that the 5070 and 5090 share the same die when the 4090 and 4080 did not share the same die.

Also, could an electrical engineer estimate how much this costs to manufacture:

https://videocardz.com/newz/nvidia-geforce-rtx-5090-pcb-leak...


Is the last link wrong? It doesn't mention cost.


The PCB cost did not leak. We need an electrical engineer to estimate the cost based on what did leak.


>That PCB is probably a few dollars per unit.

It's not. 14-layer PCBs are expensive. When I looked at Apple's cost for their PCBs, it was probably closer to $50, and those have a smaller area.


The price of a 4090 already was ~1800-2400€ where I live (not scalper prices, the normal online Shops)

We'll have to see how much they'll charge for these cards this time, but I feel like the price bump has been massively exaggerated by people on HN


MSRP went from 1959,- to 2369,-. That's quite the increase.


The 4090 MSRP was $1600:

https://www.nvidia.com/en-us/geforce/graphics-cards/40-serie...

The cards were then sold in the ballpark you mentioned, but that was because the shops could, and they didn't say no to more profit.

We will have to wait to see what arbitrary prices the shops will set this time.

If they're not just randomly adding 400+ on top, then the card would cost roughly the same.


How will a 5090 compare against Project Digits, now that they're both on the front page? :)


We will not really know until memory bandwidth and compute numbers are published. However, Project Digits seems like a successor to the NVIDIA Jetson AGX Orin 64GB Developer Kit, which was based on the Ampere architecture and has 204.8GB/sec memory bandwidth:

https://www.okdo.com/wp-content/uploads/2023/03/jetson-agx-o...

The 3090 Ti had about 5 times the memory bandwidth and 5 times the compute capability. If that ratio holds for blackwell, the 5090 will run circles around it when it has enough VRAM (or you have enough 5090 cards to fit everything into VRAM).


Very interesting, thanks!

32GB for the 5090 vs 128GB for Digits might put a nasty cap on unleashing all that power on interesting models.

Several 5090s together would work, but then we're talking about multiple times the cost (4x $2000 + a PC vs $3000).


Inference presumably will run faster on a 5090. If the 5x memory bandwidth figure holds, then token generation would run 5 times faster. That said, people in the digits discussion predict that the memory bandwidth will be closer to 546GB/sec, which is closer to 1/3 the memory bandwidth of the 5090, so a bunch of 5090 cards would only run 3 times faster at token generation.
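
Rough sketch of that arithmetic, assuming token generation is memory-bandwidth-bound (each token reads every weight roughly once); the bandwidth numbers are the speculated Digits figure and the 5090's announced ~1.8 TB/s, and the 40GB model size is just an illustrative ~70B-at-4-bit example, so treat all of it as assumptions:

    def tokens_per_sec_ceiling(bandwidth_gb_s, model_size_gb):
        # Memory-bound upper bound: bandwidth divided by bytes read per token.
        return bandwidth_gb_s / model_size_gb

    MODEL_GB = 40  # e.g. a ~70B model quantized to ~4 bits per weight

    for name, bw_gb_s in [("Digits (speculated ~546 GB/s)", 546),
                          ("RTX 5090 (~1792 GB/s)", 1792)]:
        print(f"{name}: ~{tokens_per_sec_ceiling(bw_gb_s, MODEL_GB):.0f} tok/s ceiling")
    # ~14 tok/s vs ~45 tok/s -- roughly the 3x ratio mentioned above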


Don't forget that you can link for example two 'Digits' together (~256 GB) if you want to run even larger models or have larger context size. That is 2x$3000 vs 8x$2000.

This will make it possible for you to run models up to 405B parameters, like Llama 3.1 405B at 4bit quant or the Grok-1 314B at 6bit quant.

Who knows, maybe some better models will be released in the future which are better optimized and won't need that much RAM, but it is easier to buy a second 'Digits' in comparison to building a rack with 8xGPUs. For example, if you look at the latest Llama models, Meta states: 'Llama 3.3 70B approaches the performance of Llama 3.1 405B'.

To run inference on Llama3.3-70B-Instruct with ~8k context length (without offloading), you'd need:

- Q4 (~44GB): 2x 5090; 1x 'Digits'
- Q6 (~58GB): 2x 5090; 1x 'Digits'
- Q8 (~74GB): 3x 5090; 1x 'Digits'
- FP16 (~144GB): 5x 5090; 2x 'Digits'

Let's wait and see which bandwidth it will have.
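
For what it's worth, those figures fall out of some simple arithmetic. A sketch, with my own assumed effective bits per weight for each quant and an assumed ~3GB KV cache for ~8k context:

    import math

    PARAMS_B = 70                               # Llama 3.3 70B
    DEVICES = {"5090": 32, "Digits": 128}       # GB of memory per device
    KV_CACHE_GB = 3                             # assumed for ~8k context

    def weights_gb(bits_per_weight):
        return PARAMS_B * bits_per_weight / 8   # weights only, in GB

    for quant, bits in [("Q4", 4.5), ("Q6", 6.5), ("Q8", 8.5), ("FP16", 16)]:
        need = weights_gb(bits) + KV_CACHE_GB
        counts = ", ".join(f"{math.ceil(need / cap)}x {dev}"
                           for dev, cap in DEVICES.items())
        print(f"{quant}: ~{need:.0f} GB -> {counts}")
    # Q4 ~42GB, Q6 ~60GB, Q8 ~77GB, FP16 ~143GB -- close to the list above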


> bandwidth

Speculation has it at ~5XX GB/s.

Agreed on the memory.

If I can, I'll get a few, but I fear they'll sell out immediately.


Kind of wondering if Nvidia will pull a Dell and copy Apple by renaming

5070, 5070 Ti, 5080, 5090 to

5000, 5000 Plus, 5000 Pro, 5000 Pro Max.

:O


The 3090 and 3090 Ti both support software ECC. I assume that the 4090 has it too. That alone positions the xx90 as a pseudo-professional card.


The 4090 does indeed have ECC support.


Yes, but ECC is inline, so it costs bandwidth and memory capacity.


Doesn't it always? (Except that sometimes, on some hardware, you can't turn it off.)


I believe the cards that are intended for compute instead of graphics default to ECC being on and report memory performance with the overheads included.


Anything with DDR5 or above has built in limited ECC... it's required by the spec. https://www.corsair.com/us/en/explorer/diy-builder/memory/is...


Sure, but it's very limited. It doesn't detect or fix errors in the dimm (outside the chips), motherboard traces, CPU socket, or CPU.


It’s the same pricing from last year. This already happened.



