
> We've survived waves of automation for hundreds of years. I'm much more confident that we will continue to find ways to use these things as tools that elevate us, not replace us.

The difference with past technological breakthroughs is that they augmented what humans could do, but didn't have the potential to replace human labor altogether as AI does. They were disruptive, but humans were able to adapt to new career paths as they became available. Lamplighters were replaced by electrical lighting, but that created jobs for electrical engineers. Carriage drivers were replaced by car drivers; human computers by programmers, and so on.

The invention of AI is a tipping point for technology. The only jobs where human labor will be valued over machine labor are those that machines are not good at yet, and those where human creativity is a key component. And the doors are quickly closing on both of those as well.

Realistically, the only jobs that will have some form of longevity (~20 years?) are those of humans that build and program AI machines. But eventually even those will be better accomplished by other AI machines.

So, I'm really curious why you see AI as the same kind of technology we've invented before, and why you're so confident that humanity will be able to overcome the key existential problems AI introduces, which we haven't even begun to address. I don't see myself as a pessimist, but can't help noticing that we're careening towards a future we're not prepared to handle.


As many forums say, with other tech inventions they replaced the horse, not the rider. With AI, they are replacing the rider - that makes it a unique technology that doesn't compare to previous ones. Other forms of technology typically enabled use cases that didn't seem possible before (e.g. electricity, cooking food faster, flying), whereas this one, at present, is mostly about making existing use cases more efficient and removing the need for labor. As many non-techies put it: other than doing my assignment/email/etc., what benefit does it have on my daily life, other than threatening some jobs and generating some worthless online content?

The cost/benefit for the labor/middle/low classes is at best low right now. I define those classes as anyone who needs to trade time to keep surviving as a going concern, even if they have some wealth behind them.

I think the outcome where any form of meritocratic society gives way to old fashioned resource acquisition based societies is definitely one believable outcome. Warfare, land and resource ownership - the old will become the new again.


You truly believe we’re on a timeline that involves the replacement of anaesthesiologists, emergency medicine physicians, trauma surgeons, and so on, within a 20-year timeframe? AI progress in the last few years has been astounding, but the gap between where we are and a true all-human-labour-is-inferior scenario is almost unfathomable.

I could be wrong on the timeline. But are we not moving towards a future where even those professions are replaced by AI? The current wave of ML might not be the one to get us there, but there is an unprecedented level of interest and resources working to make that a reality. Regardless of whether they succeed, there is still a mountain of societal problems we need to address with even the current generation of this technology.

But my main argument is against the notion that this technology is the same as the ones that came before it, and that it will undoubtedly lead to a net better future. I think that is far from certain, and the way things are developing only leads me to believe that we're not ready for what we're building.


> Like many OSS products, it looks and feels like product made by techies for techies.

That's not the problem. mpv is another media player that is arguably even more "made by techies for techies", yet it doesn't have the usability issues of VLC, and is a much more robust piece of software.

VLC is just poorly designed from the ground up, and the project's priorities are all over the place, as this AI initiative demonstrates.


Those are just your priorities. I don't have any usability issues with VLC, but would use the AI subtitles.

> Product names make no sense.

Why should they? Every company has different priorities, and it's on you as a consumer to get familiar with each company's product line before making a purchase.

Would you prefer it if all products were named "<company name> <thing>"? So in your PSU example, "Seasonic PSU"? Of course not, you would like to have more detail than that. So how about "Seasonic PSU 800W modular"? OK, that's better, but what if other consumers are interested in different product criteria? Should we just cram them all in the product name like sellers on Amazon do? That wouldn't be right either. So the best approach then is to segment your product line according to some criteria, and give different segments arbitrary names. This way customers can know what to expect, and which segment to focus on. It's important to keep this consistent, otherwise it leads to confusion, but in general it works fine IMO. I would rather have to choose between Seasonic Prime TX, Prime PX, etc., than Seasonic Pro, Pro Max, etc.


That's the thing though. I don't really want to have to know whether the Seasonic Prime PX is the same tier as the Corsair RM or the RMx or the RMe or the HX. None of those names gives me any indication of what the product actually is, and from a quick glance the letters don't seem to even directly stand for any particular feature.

It's one thing to have, say, a Seasonic 800M versus just a Seasonic 800 and know the "M" means it's modular. What does it even mean for it to be a PRIME TX versus a FOCUS GX or GM, or to then suddenly drop the noun part and go straight to G12 and B12?

Looking at their site right now, there are 8 different PRIME models with 25 SKUs in that one family. And that's just one family of power supplies for one brand! There's another 11 Vertex SKUs, 33 FOCUS SKUs, 15 CORE SKUs. Why on earth would I care to get decoder rings for several different brands to cross shop?


> Why on earth would I care to get decoder rings for several different brands to cross shop?

Because products usually can't be compared spec-to-spec anyway. So even if you had to choose between Seasonic 800M and Corsair 800M, which might coincide on these specific criteria, how one company describes their product doesn't translate to how another company does. None of it would tell you which is the better product for you, which is ultimately what you want to know.

Consider CPU clock speeds, for example. The industry moved on from advertising MHz and GHz since they're not good indicators of performance. Consumers should also be aware of core count and types, cache size, power consumption, etc. Yet even if manufacturers embedded all of this information in their product names, CPUs from different manufacturers still wouldn't be comparable. So companies do their best to identify product lines internally, and give them somewhat consistent names.

I'm not saying that companies do a good job at this—most, in fact, make a mess out of it—but I find that preferable to having obscure names like "Pro", "Plus", "Max", etc. with claims that it's simpler, when it's actually even more confusing than before.


Why are TX and PX better than Pro and Pro Max? Typically, Pro Max is going to offer you more or better than what Pro does. So Pro Max tells me that it is probably better than Pro (without defining what "better" is), but with TX and PX, which one is "better"? And you can't always go by the price.

In this particular case, TX and PX refer to PSUs complying with certain third-party certifications (the 80 Plus Titanium and Platinum efficiency ratings, respectively), and have a specific quantifiable meaning. So in this case TX is always "better" (or at least not worse) than PX in a very specific sense.

> I really appreciate when brands just name things like they are.

Sure, but I find this branding even more confusing.

"Dude, you're getting a Dell!"

"Oh, neat! Is it a PC or laptop?"

"It's a Dell Pro... laptop."

"..."

At least with XPS, Latitude, etc., consumers were able to easily distinguish between models, after getting familiar with the product line. Naming all products "<company name> [Pro,Pro Max]" will always be clear as mud. Not to mention that "Pro Max Premium Plus Ultra" is dumb, in that Apple way. Apple is notorious for naming all their products the same, so consumers have to use launch years to distinguish them.


If you find this branding more confusing, then you are probably not familiar with laptop shopping in this segment.

Here are all of the 2024 models available: Latitude 9450, Latitude 7350, Latitude 7350 Ultralight, Latitude 7350 Detachable, Latitude 7455, Latitude 7450, Latitude 7450 Ultralight, Latitude 7650, Latitude 5550, Latitude 5455, Latitude 5450, Latitude 5350, Latitude 3450, Latitude 3550.

You have to memorize what you are looking at - the first digit is the model line, the second digit is a reference to the screen size, and the last set of digits a reference to the model year.

So "Dell Latitude 5450" conveys all the same information as "2024 Dell Pro 14" Plus". I'm not sure what's controversial about that change.


I'm not a Dell customer, so I'm not familiar with their product lines, but from what you say "Dell Latitude 5450" is indeed clearer to me than "2024 Dell Pro 14" Plus". I can get familiar with the number scheme and infer that higher numbers indicate better performance, larger screen, etc. "Pro" and "Plus" will always be meaningless and arbitrary.

> infer that higher numbers indicate better performance

But bigger numbers don't always indicate better performance. A Dell Latitude 6420 (a Sandy Bridge) is much slower than a 5420. Same with a 7320 versus a 5440.


Then that's a confusion the manufacturer should address. Consistency and clarity is key. But it's not solved by scrapping all of it in favor of "Plus", "Pro" and "Max".

Here are six potential model numbers/names.

Latitude 6420

Latitude 5540

Inspiron 5555

2011 Dell Pro 14"

2024 Dell Pro 15"

2024 Dell Pro Max 15"

Which ones are the recent bigger laptops? Which is the older smaller laptop? Which is better, Inspiron or Latitude? Which is better, the Dell Pro or the Dell Pro Max? Which naming scheme makes these things way more obvious?


I personally don't care about the year of release. The display size is good to know, but it doesn't tell me whether it's an IPS or OLED, which I care more about. Besides, I'm much more interested in the CPU, RAM, disk size, etc.

So should all of these be part of the product name as well, just to please me? Probably not.

Yet if the manufacturer segmented their products by arbitrary brand names (which could also be "Base", "Pro", "Max", etc., mind you; I just think these are overloaded terms, and custom terms like XPS, Latitude, etc. would be clearer), and then subdivided these with sensible model names that encoded this information, it would make more sense. Provided, of course, that I as a consumer get familiar with it, which one might want to do before deciding to spend thousands on a product.

Again, I'm not saying that Dell has done a good job at this, but potentially it _can_ be done well. For example, I think MikroTik does a decent job at this[1]. It does lead to product names that are difficult to parse/pronounce like "CSS326-24G-2S+RM", but once you're familiar with the scheme, it's easy to know which product has which specs, and to compare them.
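
To give a rough idea of how mechanical the scheme is, here's a loose sketch of decoding one of those names. The field meanings are my reading of the naming guide in [1] (prefix = product series, 24G = 24 Gigabit Ethernet ports, 2S+ = two SFP+ cages, RM = rack-mount), so double-check against it:

    import re

    # Loose sketch: decode a MikroTik-style name like "CSS326-24G-2S+RM".
    # Field meanings are my reading of the naming guide linked in [1]:
    #   CSS326 -> product series, 24G -> 24 Gigabit Ethernet ports,
    #   2S+    -> 2 SFP+ cages,   RM  -> rack-mount enclosure.
    def decode_mikrotik(name: str) -> dict:
        series, _, rest = name.partition("-")
        info = {"series": series,
                "gigabit_ports": 0, "sfp_plus_ports": 0, "rack_mount": False}
        if rest.endswith("RM"):
            info["rack_mount"] = True
            rest = rest[:-2].rstrip("-")
        for count, kind in re.findall(r"(\d+)(G|S\+)", rest):
            key = "gigabit_ports" if kind == "G" else "sfp_plus_ports"
            info[key] = int(count)
        return info

    print(decode_mikrotik("CSS326-24G-2S+RM"))
    # -> {'series': 'CSS326', 'gigabit_ports': 24, 'sfp_plus_ports': 2, 'rack_mount': True}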

Anyway, it's fine if we disagree. I think we both made our case.

[1]: https://wiki.mikrotik.com/Manual:Product_Naming


I totally get naming schemes where practically every technical feature is exposed in the model number, but outside of selling to techie people you'll quickly lose people trying to remember what was recommended. I don't have a problem remembering which Supermicro board is in my router build, but I totally understand someone not having a clue what an X11SBA-LN4F is, or how to begin comparing it to some other Supermicro board.

When it comes to selling to the mass market for a single big consumer electronic good like a laptop or phone or game console or whatever, it seems to me to be way simpler to just have a few decent SKUs. Having someone try and remember "Bill said I should get the CSS326-24G-2S+RM, or was that the 3326, wait is this the one with +RM or not, hmm this is complicated I guess I'll just get something else" is a lot more challenging than having someone remember "Bill said I should get at least the Pro version; oh, that's the listing for the 2023 model I want the newer one, there we go."

You'll really burn a customer when they get confused by the naming scheme and think they're getting one thing but then when they get it home it doesn't work like their friend's because their friend is rocking the 7730-G3-M-QQ-7i gizmowidget as opposed to the 7730-G3-N-QQ-7i gizmowidget.


Sure, I agree with that, but surely there's a middle ground between cryptic model names and simple ones to the point of being meaningless. If you have to add the year and display size to your product name, then it's probably not unique enough.

Like I said, this middle ground to me are standard product line names with some meaningful product identifier, so this change by Dell seems like a regression.


> Which is better, the Dell Pro or the Dell Pro Max?

Neither. The Precision lines are workstations aimed at a very different audience. So for the average customer the "better" ranking is non-Pro, then Pro, and only in exceptional cases Pro Max. The actual differentiator is the blank, Plus, or Premium afterwards.

Bonus questions: which models have ECC RAM and Quadro cards available? Which ones have the best displays? My guess would be Pro Max Premium and non-Pro Premium, but that is far from obvious.


Bold of you to assume it's not going to be Dell Pro 5540.

> Consistency and clarity is key

For manufacturing, yes, absolutely not for marketing.

"Dell 2024 Super Max Pro Ultra Plus New Premium" is objectively better for confusing customers and tricking them into purchasing products sold at higher price and worse value proposition.


I think those kinds of model numbers are fine, as long as it's prominently explained somewhere and I don't have to chase down some forum post from a few years ago because the company can't be bothered to explain its own products to its customers.

> consumers were able to easily distinguish between models

I don't think I could ever remember which of the Inspiron or Latitude lines were supposed to be higher end without looking at the magazine and seeing the prices. I just kind of assumed "Precision" meant their high-end workstations. And I'm someone who fawned over computers and gadgets my whole life.


Yeah, you know it's bad when "Laptop" literally has to be in the proper name, e.g. "Microsoft Surface Laptop Go", because the "Microsoft Surface Go" was a tablet computer line before the successful branding got applied to regular laptops.

The worst I saw was single-use coffee packets for the hotel room coffee machine that come in regular and decaf (with brown or green color stripes to distinguish them)... except the green is regular and brown is decaf.


Mostly accurate, except I would change the last sentence to:

"We take safety very seriously. Look how much safer our SOTA model is based on our completely made up metrics. We will also delay releasing these models to the public until we ensure they're safe for everyone, or just until we need to bump up our valuation, whichever comes first."


This thread reads like AIs talking to each other. Bizarre.

Really? I can assure you that everything I said was my own writing.

I did use Miss Chatty (ChatGPT 4o) as a research assistant, and cited my discussion with her in my comment.

If you don't mind, I am curious to learn what signals in my comments led you to think that I am an AI. :-)

Maybe there is something you know about me that I don't know!


Heh it's not just you, but the entire thread. It starts with a nitpick, and devolves into a discussion entirely alien to the original post. This is similar to how LLMs catch on to a single word or topic and run with it. All replies also have that pleasantly informative tone typical of LLMs. Even your last reply is too nice. :)

But don't mind me. Just found it curious. You don't seem like an AI from your profile, but these days one can't be sure anymore. Cheers!


Well thank you. My creators programmed me to try to be nice to people. I can't say I always succeed at that!

They even wrote an entire Book for me to read. I haven't read the whole thing (or even much of it), but it has advice like this:

"Do not use harmful words, but only helpful words, the kind that build up and provide what is needed, so that what you say will do good to those who hear you."

Wait, I think that is in the Hacker News guidelines! I knew I read it somewhere.


Thanks, but be careful. You don't want kittens to die, do you?

This is very neat. I miss the era when novel and elegant algorithms like this delivered magical experiences.

This could just be from getting old, but I feel like games have lost the magic they had when availability of hardware resources was limited. Back in the 80s, 90s and early 2000s we saw games push the medium forward practically every year, and certainly every generation. Developers squeezed all the performance they could from the hardware, using novel techniques and pure wizardry. Hardware advancements certainly helped, but it was never in abundance as it is today, so developers couldn't get complacent. Necessity is indeed the mother of invention.

These days I find fantasy platforms like the PICO-8 much more interesting than the latest rehashed release from any AAA studio. I don't understand how games that are essentially asset flips can be so successful year after year.


There is still a lot of wizardry going on in modern AAA video games. In fact, the video game industry is one of the few that actually care about performance, often more than safety and correctness.

It is just that it is much more complex and much less obvious. Before, an advance could be going from flat triangles to textures, using a clever maths trick. Now an advance is better shading in the corners of a room, using a clever maths trick. The difference is that people will be able to tell immediately how much of an improvement the textures are, but unless they are in the field, they won't be able to point out the shadows in the corner of the room. These shadows are not useless though; they add up to better graphics overall, but it is not as impressive.

About the AAA gaming industry, the thing is that this is big business; these games are awfully expensive, financed by investors who want to see a return on their investment. Usually, it means that they study the market, see what sells, and do something along those lines. Not very original, but less risky.

The indie game industry is the opposite. There are thousands of studios, they can't beat AAA on content and polish, and they need to stand out from the pack, so they need something else, like original ideas. But overcoming technical limitations is usually not the driving force anymore, as there are not that many limitations anymore. In fact, indies usually underperform technically compared to AAA productions, because it would be unaffordable otherwise.

I don't know much about the PICO-8 scene, or how much of it is about overcoming the (artificial) technical limitations versus using the platform "as intended" and focusing on gameplay.


> In fact, the video game industry is one of the few that actually care about performance, often more than safety and correctness.

If it does, then it doesn't show. It feels as though most major game engines have completely given up on performance-as-a-default in favour of simplifying (sometimes marginally) art workflows, and the frame rendering breakdown articles/videos I consume now and then entail baffling, utterly ridiculous and unnecessary drawcall counts, unoptimized geometry, poorly optimized shaders that give marginal graphical improvements, huge overdraw, etc.

It's gotten bad enough that resolution upscaling is becoming commonplace because of how ridiculous the per-fragment rendering costs are getting. That has no right to happen in games without high-fidelity subsurface scattering, volumetric effects, indirect lighting, raytraced reflections, etc. And yet it does. This is something nobody resorted to even in the 90s.

I recommend videos from the YT channel Threat Interactive, where they break down a lot of the horrifying rendering performance regressions that have become the norm in recent times.


Optimization happens elsewhere these days. Modern GPUs easily chug tons of triangles as long as they are not smaller than a single pixel. That's why stuff like Nanite is possible, where you auto-compute LOD geometry to roughly pixel-level polygon resolution and it will actually be faster than traditional LODs if you have tons of meshes in your view frustum. If you want to see some really fancy modern optimizations, look for example at how Guerrilla implemented 3D volumetric clouds in Horizon. Using some clever tricks, they got the render cost down from 20ms to 2ms, thus making it possible to run on a base PS4.

As far as I can tell, Nanite is only faster when you're rendering unoptimized meshes with huge triangle counts, and strictly slower than proper LODs - I've seen convincing evidence that just enabling Nanite for already-optimized meshes results in big performance losses in and of itself. If anything, it's the prototypical example of a "performance regression for the sake of streamlining artist workflows" feature.

And honestly, I'm not very impressed by volumetric clouds when games like Assassin's Creed Unity from 2014 had enormous, sprawling open worlds with high-quality visuals comparable to much more recent games that struggle to run on hardware with several times the juice.

Very often, modern games look straight up worse for a much higher resource cost, because they stop looking better once you're forced to drop from maximum settings (and you are). There are blurry AA techniques, resolution upscaling (ew), and the fact that lowering texture resolution often degrades graphics way, way harder than in older titles using similar texture fidelity.


I would say a little of both. Some of the most important bottlenecks a new game dev will encounter are "number of colors, screen resolution, sound properties", etc.

The CPU speed is also limited, but it's a different kind of challenge to make a game so complicated that there's even enough math to do to strain the available CPU allocation, given how few permutations of screen space there are to alter.

Depending on your game, RAM limitation and program space limitation might also come into play.

But the artificial limitations are only one element to why the platform is popular, the other element is the tooling.

Much like with Unity, Unreal, Roblox, etc., you have loads of primitives and coding features (and music and SFX creation suites) at your fingertips built right into the platform, and the limitations help here by capping how subtle and complicated a person's goals are liable to be, which refocuses the developer's attention back onto the gameplay loop.


The Java 4K contest was fun (make a Java applet game that is at most a 4096-byte JAR, including all assets the game needs). Unfortunately I only discovered it shortly before it shut down (when no browsers supported Java applets anymore). I did enter one year, and it was no coincidence that that was the only time in 40 years of hobby gamedev that I shared a game I made with people online.

It was a lot of fun to try to squeeze a game into 4 kB but I can definitely also admit that part of it was being able to use the extreme limit as an excuse for making things simpler and not have to worry about the game being ugly or not having sounds or not having a title screen and all the other 90% of stuff that would have been expected to turn it into a complete game.

I play around a bit with TIC-80. Some say it is just an open source PICO-8 and I guess it is to some extent. I paid for both but I have not spent as much time with PICO-8. I like that TIC-80 supports quite many different programming languages, but maybe PICO-8 does as well these days? There is no super obvious reason to use one or the other, so I go with the one with free source code.

Real retro gamedev is tempting, but it is difficult to find a real console that was heavily restricted while also being convenient to develop for. The fantasy consoles cheat a bit in being restricted like some early 1980s hardware while running code like some 2010s game engine. It's like the best of both worlds, but also a bit boring compared to writing something actually limited to what some old console could do.


This is mostly rose-colored 1/4" thick welding goggles, I think. There are some mind-bendingly good moments in modern AAA video games. Sure, there is some repeat sequelitis, but there's original stuff too, and the tech definitely enables it.

I can remember a lot of old really, really, really bad video games too!


I was using this algorithm to make a 3D Mario Kart for my calculator (which can usually barely handle 2D graphics sometimes) which was pretty fun but I never finished

This was one of the prototypes: https://youtu.be/9Z8Bm8ZmWKI


Oooh, looks like Raycasting. :)

I was using this algorithm to make a 3D Mario Kart for my calculator (which can usually barely handle 2D graphics sometimes) which was pretty fun but I never finished

This was one of the prototypes: https://media.discordapp.net/attachments/953383695908216843/...


That's so cool! Well done.

There is lots of cool innovation happening in game tech still. Look at games like Noita, Dreams and Teardown. Or for an even more recent game, look at Tiny Glade. There’s a recent tech talk about it which is full of wizardry: https://youtu.be/jusWW2pPnA0?si=IE-6W0Z1VCBld0AT

There is also lots of cool tech happening at the more foundational levels in engines and frameworks, for example Epic’s Nanite/Lumen/MegaLights in UE5.


> I don't understand how games that are essentially asset flips can be so successful year after year.

We had plenty of shovelware like that ever since the Atari 2600 era too. People point to that (capped off by Pac-Man and E.T.) as the cause of the console market crash of '83.


If anything, the amount of shovelware sharply decreased over time. In the Atari 2600 and NES days there were hardly any sources which reported on the quality of games. You basically had to decide based on box art or word of mouth. The result was cheap movie tie-ins or the like.

Later there were a lot of specialized game review magazines, though it wasn't easy to get hold of a review for a non-current game if you didn't have past magazine issues lying around. Then there was the Internet with increasingly many free reviews, especially once online advertising took off and game magazines moved to the Web, and finally we got accessible average user ratings from Steam or Metacritic.

If a game is bad today, everyone will know it. There are no bad movie tie-ins anymore. Games are at most bland, similar to an action movie sequel, or exploitative, by using addicting game mechanics. But they are rarely bad in the old sense. I've read that the new Indiana Jones game is better than most of the Indy movies.


> If anything, the amount of shovelware sharply decreased over time.

Seriously? Have you seen the Nintendo and Steam stores? I mean, it's great that game development is so accessible now, but the amount of low quality and half-baked or abandoned Early Access games is staggering.

> Then there was the Internet with increasingly many free reviews, especially once online advertising took off and game magazines moved to the Web, and finally we got accessible average user ratings from Steam or Metacritic.

Reviews don't mean much when publishers enforce embargos, or when they invest in marketing to create hype and drive preorders, only to deliver a broken game at launch with promises to patch it over time. Or when they're gamed by review bombing one way or the other, depending on some internet drama. Or when review sites are given different weights in calculating the overall score, leading to alternatives like OpenCritic. So, yeah, reviews can be helpful, but they're far from reliable.

> If a game is bad today, everyone will know it.

Ehh, highly doubtful, as I mentioned above. And even if there are negative reviews, companies have many different ways of hiding how bad a game is, and still profiting from it regardless. They can launch a broken/unfinished game, patch it over time, and still end up with goodwill from consumers because of the work they put into it. See No Man's Sky, Cyberpunk 2077, etc. This way they profit from preorders, take as much time as they want to deliver the experience they advertised (while continuing to sell a half-baked product), and ultimately end up with an unscathed reputation so they can do it again. Then there are outright scams like The Day Before, countless asset flips, lazy ports, etc. Mobile gaming is infested with this garbage.

However you define "bad game", these products are certainly hostile to the consumer and there should be regulation in place to prevent them. This situation is far worse than the one that led to the 1983 crash, yet gaming has never been as popular as it is today.


I never bought that excuse for walled-garden consoles, since home computer games continued to do just fine after 1983 without locking down platforms.

In my game we intentionally constrained ourselves to a low resolution, a fixed colour palette, etc. It means we can't fit much text, for example, and have to keep things as lean as possible. We even simulate a CRT monitor :)
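
For anyone curious, the general idea behind that look is nothing exotic. This isn't our actual code, and the palette, resolution and numbers here are arbitrary, but a toy Pillow/NumPy snippet along these lines gets you most of the way: downscale, snap each pixel to a small palette, darken alternate rows.

    # Generic illustration (not the game's implementation): render at a low
    # resolution, quantize to a small fixed palette, then darken alternate
    # rows for a cheap CRT-scanline look. Uses Pillow + NumPy.
    import numpy as np
    from PIL import Image

    # Arbitrary 8-colour palette, purely for illustration.
    PALETTE = np.array([[0, 0, 0], [85, 85, 85], [170, 170, 170], [255, 255, 255],
                        [136, 0, 0], [0, 136, 0], [0, 0, 136], [255, 200, 80]])

    def retroify(path: str, out: str, size=(160, 120)) -> None:
        # Downscale to a low internal resolution, like a constrained game would.
        img = np.asarray(Image.open(path).convert("RGB").resize(size), dtype=float)
        # Snap every pixel to the nearest palette entry (Euclidean distance in RGB).
        dists = ((img[:, :, None, :] - PALETTE[None, None, :, :]) ** 2).sum(-1)
        quantized = PALETTE[dists.argmin(-1)].astype(np.uint8)
        # Darken every other row for scanlines, then upscale blockily.
        quantized[1::2] = (quantized[1::2] * 0.6).astype(np.uint8)
        Image.fromarray(quantized).resize((size[0] * 4, size[1] * 4),
                                          Image.NEAREST).save(out)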

You can check it out here: https://store.steampowered.com/app/3133060/Gnomes/ ;)


That magic is still there, it's just overshadowed by capital. It's hard to find magic in a sea of marketing. Try Noita, it's quite fun.

I enjoy Noita, and interesting games like it, but I wouldn't describe it as "magical". Indies certainly are more enjoyable to play, but very few evoke that same feeling of awe like when you first experienced Doom or Mario 64 in the 90s. In a way I pity newer generations that have grown up in a digitally overstimulated world and missed the era when technology truly felt magical.

The only game that triggered this feeling for me in the past ~decade was Breath of the Wild. Many other games are fun and enjoyable, but very few have that same magic. Whereas in the 90s this was much more common.

I do concede that growing old could be part of the reason for losing that feeling, but I also think that the oversaturated market and lack of technical constraints play a role.


"when availability of hardware resources was limited"

If you aim for the broad mobile market - then you will have some serious limitations, as the majority of humans do not have flagship smartphones.

And even AAA games for the PC market - the better your game runs, the bigger the audience you can reach. Like the other poster said, bigger textures are far from the only thing that is happening behind the scenes.

In other words, I wish I had unlimited hardware resources.


> the majority of humans do not have flagship smartphones.

I always wonder if aliens are interacting with us through the anonymity of the internet. Things like this give me a (facetious) hope that they are. We can take from this that all aliens have flagship smartphones, but the majority of humans don’t.


The joke being flagship as in alien flagship? Otherwise I don't follow ..

It was because you referred to “humans”, as if distinction was required from a larger group. Nothing more than that, sorry for the confusion.

I see... it might have been because English is not my first language, but I thought the way I wrote it was clear and common?

It is very clear. Your English is perfect, but I suppose it is unusual (to me) to see references to “humans”, maybe “people” is more common but there is absolutely nothing wrong with what you wrote.

> I don't understand how games that are essentially asset flips can be so successful year after year.

In my opinion, whether a game is enjoyable or not doesn't really depend on whether it was technically challenging to make. I don't know any successful games that are in fact "asset flips", but I do know that a lot of the games I really enjoy are not technologically "pushing the envelope" at all: Caves of Qud, Cube Chaos, Kenshi, Rift Wizard, just to name a few.


> In my opinion, whether a game is enjoyable or not doesn't really rely on whether or not they were technically challenging to make.

I agree, but that's not what I'm saying. I was referring to cookie-cutter games released by Ubisoft, the CoD franchise, and most sports games. They essentially use the same formula, sometimes even the same engine, year after year, and just upgrade the assets and do some minor tweaks. Yet these releases are incredibly popular and people keep buying them every year. Similarly for lazy "remasters" of games released not even a decade ago. Now with AI upscaling it's even easier to just increase the texture resolution and charge half or even full price for it. It's practically a scam.

> I do know that a lot of the games that I really enjoy are not technologically "pushing the envelope" at all: caves of qud, cube chaos, kenshi, rift wizard, just to name a few.

Sure, and I enjoy many of those as well. A game doesn't have to be technically impressive to be enjoyable. But IMO these don't have the "magic"/"wow" factor that was so common back in the 90s. If you played Doom or Mario 64 when they were released, you were _mesmerized_ by what you were seeing. Yes, these were great games, but the technical wizardry was what made them stand out from everything else. The jump from 2D to 3D and hardware-accelerated graphics certainly played a role in that, but it was also due to very clever algorithms that ran on the same hardware as other games that weren't as impressive.

Today VR can arguably deliver that "wow" factor, but that's mostly due to the technology rather than the games themselves. Even nearly a decade into the modern VR wave, there are only a handful of games that deliver an experience close to what many games in the 90s did.

I can basically think of only one example of a modern game that delivered that same feeling: Breath of the Wild. We were used to open world games, we were used to Zelda games, but we hadn't seen such design ingenuity and technical polish, especially on a handheld system. Tears of the Kingdom is arguably a better game, but it didn't deliver that same experience for me.


Ah sorry, I misunderstood you! I agree with everything you say in this comment. Although I do wonder how much of that "magic"/"wow" factor was due to me being very young back then, rather than those games actually being good. Looking at you, 1995 Magic Carpet.

I agree that VR tech in general is just "wow" a lot of the time, even if the games are not that great. But I think that "Hello Puppets", "Scanner Sombre" and "Superhot" were all really cool and innovative, even within the VR scene. I think there are still technological innovations possible that could deliver more "wow" moments.


> Although I do wonder how much of that "magic"/"wow" factor was due to me being very young back then, rather than those games actually being good. Looking at you, 1995 Magic Carpet.

Haha sure, there are also cases where a game is technically impressive but not actually good/enjoyable. The real magic is when both of these come together.

I think it's still too early for VR games to be both (with a few exceptions). We haven't quite solved major problems like locomotion, motion sickness, and comfort, of course. Maybe a few generations from now we'll start to see truly mesmerizing experiences that are both technically brilliant and enjoyable to play.


I've been reading Masters of Doom and it's a nice feeling to relive some of that secondhand.

I’ll never forget my disappointment when I discovered that the higher resolution offered by a better GPU meant more display pixels, not more triangles. I thought it would make all the models in Descent 3 smoother. :)

Moonshot[1] is another similar game, and a bit more polished. It was abandoned, unfortunately, but it's fun for a few minutes.

I just noticed there's Orbit Outlaws[2] from the same developer, which builds on the same concept (for better or worse), but is also abandoned.

[1]: https://store.steampowered.com/app/426930/Moonshot/

[2]: https://store.steampowered.com/app/1319100/Orbit_Outlaws/


Unfortunately, nothing will actually change from the inside. This industry is rotten to the core, and companies will continue to exploit users and other companies as long as they can profit from it. It's not like PayPal Honey was some obscure company with no visibility. PayPal knew damn well what they were buying and how the company operates.

The only way this could change is if the tech industry is hit with strict regulations. But considering that governments are technically incompetent, and that they're either in symbiosis with or plain bought out by Big Tech, this has no chance of happening. Especially in the US, where any mention of regulation is met with criticism even from consumers, and where Musk will be taking the reins for the next 4 years.

Once this "scandal" blows over and consumers forget about it, PayPal Honey will either continue to exist, or will rebrand as a different company in the same industry, operating the same way it does now.

As for influencers: it's hilarious that you think any positive change could come from them. They only care about getting paid, and could promote anything that lands in their inbox. Hell, they're often the ones who scam their own audience. We're decades away from regulating that whole mess.


This is a forum run by a Silicon Valley VC firm, frequented by tech entrepreneurs. Ethical behavior is not high on their list of priorities.

I'm not sure they could even define ethical.
