FTC's rule banning fake online reviews goes into effect (go.com)
613 points by indus 22 days ago | 342 comments



Does the regulation say anything about deceptively moderating reviews? e.g. deleting all the low star reviews?

edit: it doesn't seem so. You just have to use some weasel language:

>The final rule also bars a business from misrepresenting that the reviews on a review portion of its website represent all or most of the reviews submitted when reviews have been suppressed based upon their ratings or negative sentiment.

https://www.ftc.gov/news-events/news/press-releases/2024/08/...


How does this stop one of the most common practices?

* Step 1, take a product with a terrible rating

* Step 2, create a new SKU for the exact same product so it has no ratings

* Step 3, get a handful of fake 5-star reviews (in some way the FTC isn't going to crack down on)

* Step 4, blast the old terribly reviewed product that now has good reviews on marketing

* Step 5, get 10s of thousands of sales, $$$

* Step 6, let the terrible reviews pour in

Repeat from step 1 (possibly under a different brand name).


This is an important thing to tackle too. Amazon is notorious for allowing shady practices like selling product A to rack up lots of 5-star reviews, then changing the listing to a completely different product (which may or may not deserve 5 stars) ...

Another aspect is review solicitation. e.g. iOS games often pop up their own "Rate us" modal: if you pick 5 it redirects you to the App Store to leave a review, and if you pick 4 or less it redirects you to a feedback form. They grease the path for positive reviewers.


If an app pops up a "rate us" modal, it gets a 1-star in the app store, with a note to the developer why. I don't care how great your app is.


As an indie app developer this makes me really sad. We need reviews, otherwise we won't get enough downloads. Big companies can spend huge amounts on ads; we can't, and thus rely on positive reviews and ratings. The fact is that most users won't rate unless asked.

If you really like an app give it a nice review.


While I appreciate that need, as a user this is the worst way to get me to review your app. Especially because so many of them aren't tuned for paying any attention at all to what their users are doing before prompting them. I had one app recently prompt me for a review before I'd even completed their "first time tutorial" slide deck. Not only do I not know enough at that point in time to even review the app, but if I was so inclined to click through at that moment it would have been to leave a review complaining about the practice rather than saying anything substantive about the app's functionality. But even when they're not that bad, they're almost always popping up when I open the app (the moment when I'm specifically intending to do something that I'm now being interrupted) or in the middle of some workflow. It's the same annoying behavior that web pop-up folks used to do too.

Personally, I'd rather see you add a small UI element somewhere, or a banner that appears briefly but critically doesn't cover up any controls. If you absolutely MUST use a pop up, you know when the best time to do that is? After I've completed some in app purchase. If I'm spending money on your product, chances are I'm moderately satisfied with it and feeling pretty good about it at that moment. Or if you don't have in app purchases, unless you've made a "content browsing only" app, you probably have some workflows that have a definite end state. Prompt me then, at the end of me doing what I've come to your app to do. But I've never once given a review / stars to any app that has interrupted me in the middle of or at the start of doing something.


The nRF connect app (bluetooth debugging tool, mostly) asks for reviews at the bottom of the changelog for app updates, along with an explanation for why they don't do it inside the app itself. Very glad they handle it that way, and I don't think I've seen any other app that does.


That is a nice way to avoid a dark UX pattern, but very few people actually look at changelogs, so even in that case, a less technical app doing the same thing may be missing a lot of potential reviewers.

Granted, there are plenty of other places to slot a review link in where it could be just as effective.


> Especially because so many of them aren't tuned for paying any attention at all to what their users are doing before prompting them

I had Napper, a baby sleep tracker, ask me to rate the app at 3 am. Yes, I'm definitely going to give a thorough and well-thought-out review at 3 am.


But that's the thing, if it pops up a plea for ratings (or an ad or anything else unwanted and annoying), then I really, genuinely and honestly DON'T like the app.


Then don’t be annoying about them. I do the exact same thing that guy talked about. Dark patterns get explicit 1 stars


Unfortunately an annoying app will outcompete a non-annoying app in terms of reviews. Even if a few people like GP 1-star it, it's still worth it since most will 5-star it.


This is the reality, but it's bad. How do we fix it? An App Store policy banning the practice? Global extensions like in web browsers that can use block lists to let users hide annoying elements automatically? De-weight reviews from users whose app install originated from an ad click rather than organically, to level the playing field?


The best way I've found is: stop using apps. If I'm using the phone, it's either to make a phone call or to use Firefox. Apps might "solve some need", but it seems like all of them are more interested in data collection and selling that data to "their partners". We're better off throwing these black mirrors into the ocean.


> The best way I've found is: stop using apps.

This is what I'm doing as well. Apps have increasingly gotten more annoying in more ways - from unnecessary pop-up notifications (increased permission requests, policy updates, review pan-handling, etc), privacy issues, data hoarding and more. I also hate that almost all of the few remaining apps I do use are constantly pushing new versions into the app store, invariably with only a vaguely non-specific unchanging boilerplate sentence as a change log. Yet I never notice any new functionality or capabilities in the app and all-too-often updates only bring more ads, cross-promotion or other general enshittification (like just renaming or regrouping the same functionality in different ways - apparently for no reason other than to increase some internal aggregate 'usage metric' to hit a KPI). Although I don't know this, I assume app store algorithms must somehow (perhaps unintentionally) incentivize developers to constantly update their apps for little or no reason.

So, as a group, the long-term behavior of app developers has taught me to resist updating the few apps I do still have installed.


A way to fix the problem would be for the App Store to ban that practice _and_ itself nudge users for ratings in a less annoying way; like asking you to rate a list of apps you have been using a lot when you open the App Store, and also asking you to rate an app when you delete it.

That would be a win for everyone.


Not when you open the app store.

The typical app store workflow for me is I visit the store to download a specific app I'd like to install. That app will then have to download and install while I wait.

That "while I wait" is an ideal time to ask me to rate other recently installed apps, or an app I haven't used in a while.


I came here wanting to say the same thing. It's a lot like Amazon emailing customers periodically to review recent purchases and making it really easy to do it. I pretty often do that and it works! It doesn't feel annoying either because it isn't in my way.

The key is catching the user when they aren't completing a specific task. People often check email to pass time, which is perfect for this.


Yes. Most of the major apps play this review game, and there's no way to compete if you don't play it too.

The major apps typically exploit selection bias to solicit 5-star reviews. They will wait until the user meets some criteria for "having a good experience" and show an app review prompt at that moment.

Then, having amassed thousands of 5-star reviews, they will turn up the threshold so that only the most likely 5-star reviewers keep trickling in, to offset any negative organic reviews.

There's a related practice of "pre-prompting" where the app first asks the user whether they are satisfied and only solicits a real app review from those who pass the screening question.

It's all quite shady and makes it hard to trust app reviews. But until the app stores solve this, app developers need to play the game.
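
To make the gating mechanic concrete, here's a rough sketch (the thresholds and field names are invented; this isn't any particular app's code):

  # Only surface the store-review prompt to users who are statistically
  # likely to leave 5 stars; everyone else gets the internal feedback form.
  def should_prompt_for_store_review(sessions, recent_crashes, completed_goal,
                                     pre_prompt_answer):
      if recent_crashes > 0 or not completed_goal:
          return False          # never ask a user who just had a bad experience
      if sessions < 10:
          return False          # wait until they're invested in the app
      # "pre-prompting": only pass along users who already said 4 or 5 in-app
      return pre_prompt_answer is not None and pre_prompt_answer >= 4

The effect is exactly the selection bias described above: unhappy users never see the store prompt at all.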


if you make a good app don't pester me with popups while i'm using it. that's horrible UX, simple as.


we will, of our own accord without nagging.


It's only a guess, but I don't think the data is on your side. I seriously doubt that appreciative users "will, of their own accord without nagging" rate apps. I'd bet it's less than 2% who do.


The data for physical sales definitely show that prompting customers for a review increases the amount of positive reviews you get. It's basically what rating sites like Trustpilot sell, along with removal of all those "unrelated, you bugged me so you get a 1" reviews, because those sites tend to be a little shady.

This is just a guess, but I’m not sure getting an e-mail asking for a rating a few days after a purchase is really as “get out of my face” inducing as the App pop ups. When I open an app to buy a ticket for public transportation, that is usually while I’m actively boarding the train/bus (because why would I do this in a timely manner?). That is the least likely moment I’ll respond well to review requests. I don’t think I’ve ever been tempted to leave a bad review over one of those emails, but I’m very often tempted to do so by app pop ups. If I’m not the only one then maybe the data would be interesting?

That being said, unless the 1-star ragers spend time on their review, it's typically rather easy for the app creator to challenge, at least in the App Store.


Yeah the comment reads as originating from a person who has never tried to sell something. You need to ask to get attention


> You need to ask to get attention

That’s fair, but if you push someone to review your app they’re going to rate it as they see fit, based on what’s important to them, not what the developer thinks is important. If the user feels strongly about a particular element - such as a pop up asking for a rating - they’re going to rate it based on that element. A developer is always free to change the app if they think it’s useful to appeal to that group of users, or ignore that group of users and accept that they don’t like the way you designed the app.


That’s fine, but don’t be upset when your cry for attention is met with a one star review. You’re putting your own needs before those of your users and we all understand why.


Can’t keep working to improve an app if nobody downloads it because nobody knows it exists because nobody ever leaves reviews because nobody reminded the user that reviews are important.

I also don’t like review popups but, excepting egregious examples, I try to be patient because it is beyond most developers’ control that they have to do this in order to maintain a standing in the marketplace.


Or as a knee-jerk reply from a kid when asked to do their homework or take out the garbage. :J


I read it as the tone of someone sick of advertising. I understand you need to sell your product (that I may even like), I just don't give a fuck. When your UI pisses me off enough, then you get a rating. I cannot stress this enough: it is not your customer's job to evangelize your product, even if their literal life depends on its continued existence. Entrepreneurs/sales need to get over themselves.

Signed, Someone who worked in sales.


What is there to be sad about? You're getting the feedback you want: Don't beg for things, it's obnoxious and destroys whatever goodwill you have.


If the app is "clean" in terms of respecting users and privacy, I'm totally inclined to rate it when asked. I appreciate the resources required to make them.


You can also ask for reviews nicely without a popup (just show it on some confirmation screen), really no need to bother the user with it.


What makes me really sad is that you've identified a problem with capitalism but decide to push it onto your customers. It's true that most users won't rate unless asked, but that just means they don't want to and it's not your place to “make them”. It's not their fault that big companies exist that can pay huge amounts on ads.


Yeah I understand this and definitely do not retaliate against being asked for reviews. I find the usual modal pop-up for a review can be a bit jarring or appear at inopportune moments though. I wonder if not using modals would be better.


Being annoying and jarring is the point.


If you want to be less annoying, do it after the user has a "win" in your app: after they use it for something useful or have a fun interaction, depending on the type of app.

Don't just interrupt me randomly before I do the thing I need


Absolutely my practice as well. App devs should never be in the business of nagging for reviews.


I don’t daily drive Android, so I’m not sure if there’s an equivalent, but iOS has a system review nag/prompt that can be disabled globally. If the app lets the system manage it (where it waits a while to see how long you’ve used it before surfacing the prompt, and doesn’t redirect sub-five-star reviews to their own internal tracking), then I’m happy to leave a genuine review. If the app violates any of these rules, I go out of my way to leave a one-star review.

Don’t overrule my preferences in the name of growth hacking.


Only do this for big corporate apps. The little guys are struggling just to keep their heads above water because Apple punishes them if they aren't getting reviews.


On my phone, I have the Play Store firewalled and only allow it out when I want to update or install something.

if I could be bothered with the effort, this is the kind of petty I would engage in.


I only do that if the app asks me for an internal review first, and then, when I give 5 stars, it asks me to give a review again in the store - then I give 1 star.


iOS: That’s 100% against the rules. Much like other dark patterns like forcing a sign up or location access as gating to the rest of the app. Or using notifications for advertising.

Now if only Apple would enforce those (or stop doing them themselves).


I've thought about starting a page to call out the apps that abuse push notifications for ads to show that Apple isn't enforcing its rule.

> 4.5.4 ... Push Notifications should not be used for promotions or direct marketing purposes unless customers have explicitly opted in to receive them via consent language displayed in your app’s UI, and you provide a method in your app for a user to opt out from receiving such messages. Abuse of these services may result in revocation of your privileges.

The worst offender is DoorDash. If you turn off push ads, after you place an order it will prompt you to turn on notifications "to get the latest on your order". Agreeing turns on ads. You get the prompt even if you already have order update notifications enabled.


Deliveroo as well is incredibly needy about wanting me to let it send me notifications. This has the opposite effect as I assume it'll send me spam notifications constantly.

I have a strict rule that only people are allowed to make my devices notify; apps get notifications disabled by default.


I block every single notif from nearly every single program on my phone. The only real exceptions are my bank and brokerage and games I play everyday; you know, stuff I actually care about.

I haven't lost anything from blocking the rest, and I'm not about to start allowing now.

"Notif" because it's Not a question of If I will allow them, also because it's not worthy of being called by a full and proper name.


I just don’t install apps.


4.5.4 was what I was thinking of when I mentioned Apple violating their own rules.

They’ve gotten pretty bad about it.


I got a “rate this app” popup from some Apple app (why should they even care?). Can’t remember which one, though, so it’s pure anecdata.

I suspect that some dependency popped it up.


I’ve seen that too, confused the hell out of me.


Unenforced rules aren't rules so much as taxes on the honest.


And a potential cudgel with which to strike those whose success is inconvenient.


Also if they're selectively enforced then they are a way to discriminate.


That’s a pretty clever phrase!


oof - the app we work on at my company does all of these..


While I understand why people don't like some of them, the truth is that the vast majority of the App Store rules are really good for the end user/consumer.

Unfortunately Apple doesn’t seem to care unless the rule is really good for Apple.


Did you just have an “Are we the baddies?” moment?


They probably get way more reviews with the prompt, and positive ones, than without it, despite how some morally indignant outlier HN commenters would react.


Oh they absolutely work. And given that ratings are about the only thing that matters in the App Store besides search ads, there is a huge incentive to push for it no matter how horrible it is for the user.


Does the new product have the same ASIN?

How could they allow this?


New ASIN. They can take a physically unbranded product and list it under a new name brand at will. They can change the quantity or bundle. They can change an irrelevant attribute. Amazon plays ignorant.

I sell a product there and some of my competitors are doing those things I listed. Their reviews are also very obviously fake. I’ve also received some obviously fake negative reviews. I’m not really holding out any hope that it’ll get better anytime soon.

I just reduced my Amazon advertising spend so I can focus on other channels. Also a little bit out of spite.


> Amazon is notorious for allowing shady practices

Yes! As a heavy, long-time Amazon user I hate this and have to believe Amazon is knowingly complicit either in continuing to enable this shady vendor behavior or conveniently looking the other way. Of course, shady vendors will game whatever measures Amazon might take to prevent such tricks but it's so prevalent I don't think Amazon seriously invests in detection/prevention of 'rating swapping' on an ongoing basis anymore.

Another super annoying thing Amazon enables is allowing sellers to list multiple different products (SKUs) on the same listing. This was originally intended for things like different colors or sizes of the same product but it is frequently abused by vendors to bundle quite different products into one listing and thus sharing one rating.

Before they were the overwhelming market leader, Amazon used to care about and invest in the accuracy and credibility of product reviews and ratings. About 10 years ago they seemed to stop putting as much effort toward this and certainly in the past five years they don't seem to care if vendors subvert the system. I understand bad behavior can never be 100% prevented but Amazon could police and penalize it far more effectively. For example, requiring sellers above a certain volume of sales and listings to have increasingly stringent "real ID" type verification, making it harder (or at least more costly) to just relist under a new identity when caught cheating.


"Amazon is notorious for allowing shady practices"

Surely, 'conspiring to/orchestrating profit through immoral practices' is a more precise statement of Amazon's activities.


*immoral and illegal


Isn’t that against App Store TOS?


Well step 3 is the part they just made illegal. If you are OK with breaking the law, nothing is going to stop you until you get caught and fined. Presumably the getting caught and fined part will be enough deterrent.


There's a difference between "fake as defined by the FTC which you will actually get in trouble for" and "fake".


It's your comment in the context of the FTC. You said it was fake, in the context of the FTC. Why are you debating yourself?


Please re-read. The FTC defining it as fake means nothing if the FTC does not, in practice, crack down on it regularly.

The FTC can say it's illegal to do X, and all companies can do X with impunity if the FTC, in practice, does not do anything about it when companies do X.


what qualifies a review as fake? If I write it, it's a review isn't it? The whole thing is subjective. Plenty of people love products I can't stand


And how do they even audit it? Do they require only users who verifiably used/purchased the product to submit reviews? Do they require the reviewer to actually use the product, for a sufficient amount of time so that the review is more than just a "first impression"? So many loopholes; this won't change anything except perhaps at a few big marketplaces, and it's doubtful they will be able to police it.


I don't have the most faith it will be easy to execute but I would imagine:

- Some disgruntled people at companies could leak directly, which would make engaging in this behavior riskier

- Random individuals or competing companies could monitor product reviews and report. For example, show that an Amazon product ID used to be for another product 3 months ago when reviews were written.
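
As a crude sketch of what that monitoring could look like (the snapshot data and the similarity threshold are made up for illustration; this isn't any real tool):

  from difflib import SequenceMatcher

  # Compare the oldest and newest title snapshots recorded for one product ID;
  # a big drop in similarity suggests the listing now sells something else.
  def listing_probably_hijacked(title_snapshots, threshold=0.4):
      if len(title_snapshots) < 2:
          return False
      oldest, newest = title_snapshots[0], title_snapshots[-1]
      return SequenceMatcher(None, oldest.lower(), newest.lower()).ratio() < threshold

Anything flagged would still need a human to check before reporting, but it narrows the haystack.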

I'm optimistic. There are a lot of regulations (including digital regulations) that everyone ends up following even if the government isn't monitoring things themselves. The risk of penalty just needs to be high enough, and hopefully places like Amazon realize the downside/penalty of fake reviews now makes it worth policing.

It obviously won't help your "first impression" review problem but that's not the intent of the law and not sure why the government would be involved in that. A lot of movies don't hold up well on a rewatch, too. If you are that particular about buying something that lasts X years then you can seek out dedicated advice blogs/youtube channels.


That's actually a really good point. I can review a can opener in a few minutes. Either it opens the can or it doesn't. How would I ever review something like a Ford F-350? I don't even have a trailer heavy enough to test the towing capacity.


Well, that's a bad example ... The can opener I had for the first 50 years of my life left a dangerous crazy sharp metal edge around the opening which I cut myself on more than once. The Oxo can opener I've had for the last 10 years rolls the edge as it cuts and removes the entire top of the can; what's left is extremely safe, at least by comparison with the old style.

Then again, when I was much younger, I had a backpacking can opener that was useful when hiking in places where sometimes buying canned foods made sense. It was about as large as a very large postage stamp, and crazy good for the size and weight. I wouldn't want to use it at home (much), but it was awesome when I had to carry it around.

So, even for can openers, the story can be complicated.

Also, assuming that the primary purpose of an F350 is towing is ... interesting. Lots and lots of them here in rural NM (as much as anywhere, anyway), and they are rarely towing anything.


Not debating the practicality here, but even if you need your truck to do something only once in the entirety of your ownership, it needs to be capable of this all the time. Towing, crawling, etc.


I disagree. I've never had a vehicle that does 100% of whatever I'd want a vehicle to do. At some point we need to make tradeoffs and accept that we'll either have limitations or need to solve some problems in a different way.

Letting something that is 1% of operating hours for a device drive requirements strongly is often a mistake. With some obvious exceptions because e.g. I cannot choose when I am going to engage in maximum braking and defer it to a different vehicle.


They do make trade offs. Just not the same you might make. The F350s are limited on where they can park and are a pain in the ass to drive around a city. Some people tow stuff more frequently than they go into the city though, so it probably is a reasonable trade off to them. Also comes with some other perks like comfort and more beefy off road capabilities. Something that is valuable in rural areas even without towing.

I tow stuff about a dozen times a year and live in a city. I drive a Tahoe because not being able to tow when you want to is a pretty big inconvenience even though I’m a single occupant driver 90% of the time and it’s way bigger than I “need”. Turns out it’s quite comfortable and I just like it, even if I wasn’t towing ever.

I went years of renting vehicles just to tow. It sucks in a lot of ways. No one just wakes up and thinks "I'm going to tow some stuff". You're doing it for a reason, there's probably a high amount of labor involved in that reason, and trying to do it all in the rental window or find an appropriate vehicle on the day you need it is a challenge. I've set rental reservations, then it rains so I can't do the work I needed to. Clear skies tomorrow, but I have to wait a week for another rental to be available. It's a hassle.

Another thing I struggle with is that my towing needs fluctuate a lot. Earlier this year I was doing a construction project and ended up needing to tow stuff practically every day for 6 weeks. If I tried to do that any other way than owning a capable vehicle, it'd have been logistically challenging. Trying to time vehicle rental with trailer and equipment rentals would have dragged the construction project out to easily triple the time just by adding delay, probably much longer. Not to mention the cost of it all. The bigger vehicles do cost more, but they are assets, even if depreciating. When you rent, it's pure expense. The rent vs. own calc can flip quickly.


Sure. I'm not saying it's completely unreasonable.

Here the person was saying "once in the entirety of your ownership". If it's really once in the vehicle's life, then you really should rent something else when you need this.

I understand renting vehicles to move stuff is a PITA. I've used the hardware store's trucks several times and it adds a lot of anxiety to a project (though I've never had a really tough time with availability).


Ah I think he was making a point about the need being Boolean more so than a literal meaning of once. You said 1% which probably matches up to my usage of the tow feature. All good though, those rentals are definitely the most available but they rarely work for me as I usually need more time. They design it to be highly available for short store-to-home trips.

Occasionally I still rent, sometimes I need a bigger truck than I have due to weight.


I bought a truck for similar reasons (was tired of constantly having to rent/borrow cars to tow or haul/pick up something that doesn't fit in a "normal" car). I got a lot of utility use out of it over the years and I do honestly agree, even though I now almost never have to use it for anything truck-related I'm still very happy with it, it's very comfortable and reliable. I'd buy another one in a heartbeat. The convenience of knowing I can spontaneously throw anything I want in the back without ever thinking or planning about it in the rare cases I do still occasionally have to is just the cherry on top at this point.


I think most truck drivers have a similar story and just continue buying trucks after because they’re so convenient even if the demand is super low for actual truck stuff.

The comfort part is hard to discount too as is the increased visibility* and the fact that people choose a vehicle as a fashion statement.

* yes I know tall trucks are less safe for pedestrians and near distance visibility is reduced. That’s a low frequency occurrence for me, not a lot of pedestrians where I am, and is not something I even consider during purchase. Visibility in traffic and car centric places is so much better.


i am sorry but i do not understand.

if a car advertises that towing is a feature, and that the truck should be dependable in its features (which is literally Ford branding), and then towing only worked.. one time (barring extenuating circumstances) -- it most definitely is a product which failed to deliver.

a lemon, so to speak.


I'm not talking about towing being advertised as a feature. It's that choosing a vehicle based on something you need to do once every 5 years is not a great way to choose a vehicle.

There's not even a single vehicle that I could choose that would meet all of my different use cases for 5 years. It's better to pick something that fits the 95% use case best, and figuring how best to plug the gaps for the other 5% of the time.


> but even if you need your truck to do something only once in the entirety of your ownership

I'd just say rent something for that one off time in its entire ownership. Otherwise, I'd be daily driving a 26' box truck because I moved apartments every few years.

One time I had to ship a few pallets of stuff across the country. I guess I should have just bought a semi-trailer truck as a daily driver.


I can rent a box truck for moving easily enough, and generally I know far enough in advance that I can reserve it.

However I've never found a truck I can rent to tow. Sure I can rent trucks, but they come with a large pile of fine print which says I cannot tow. Even those box trucks cannot tow, or can tow but only their own trailer, which has specific restrictions on what you can use it for. Oh, and the trailer they allow you to use has surge brakes, which are terrible.


I've rented trucks to tow a few times over the years. Enterprise truck rental has trucks for towing, just a weight restriction.

But to be honest the vast majority of times I've needed to rent a truck to tow something it's because I was renting something towable. I can't imagine I'd bother renting some equipment from one place just to rent a truck from someplace else.

In fact, it's not like one needs some giant truck to tow many things. The vehicle I've owned that had the most use out of its tow hitch was a Ford Focus. I've gotten a bit of use from my midsize crossover which has 5,000lbs of tow capacity. More than enough for a small boat or jet skis or a small trailer.


You’re making a lot of assumptions based on your reality. I usually tow heavy stuff. I max out my half ton truck limit frequently and even have to rent a 3/4 ton or 1 ton. That’s f150/1500, f250/2500, f350/3500.

It might only be a few times a year: I need to move or rent some heavy equipment (excavators and skid steers and lifts, mostly). Sometimes I tow a trailer that, when empty, would exceed your vehicle's limit. UTVs are a huge hobby in the US and they weigh about 1,500 lbs each; I usually tow 3 of them, and the trailer is 2,500-3,000 lbs itself.

My folks live in a rural area and do this stuff weekly. Yet when you/the GP above (complaining about all the F350s with no trailers) see them, they're likely not doing that; they came into town for something. Your sampling is off because you never go where they are when they use those features the most.


No, my sampling is knowing people who have trucks and yet acknowledge they never tow anything. My sampling is having someone tell me they needed a giant truck because they had a third child and need the interior space compared to their old compact sedan, a truck they use to commute to their job selling insurance. My sampling is seeing rows of giant lifted trucks in an urban apartment parking garage night after night for years without ever seeing a lick of dirt on them. I'm sure they're just constantly out towing excavators to their downtown urban apartment.

They didn't just come to town or something, they live there. They work there.

Your sampling is off because you never go where there's crowds of people who absolutely just have a truck as something to commute from their urban apartment to their office job. You never bother seeing the urban cowboys going to their finance jobs.


You have sampling bias. I've known a lot of people who live in apartments who use their truck for truck things often. Many construction workers live in an apartment. Many of them get out to the country on weekends...

A vehicle is expensive. A second vehicle is a lot more expensive. Renting a truck is expensive. If you need a truck just 5% of the time it is overall cheaper to just drive a truck for everything than to have two vehicles or try to rent.


You have sampling bias. I've known a lot of people who never use their trucks for truck things. "But sometimes I put things in the bed!", acting like there's no way to fit a bicycle or a tent in a hatchback. Many office workers live in an apartment and own a truck. Many of them think they'll end up pulling a boat or a camper sometime (they never will), but in the end still just go to the same bars and clubs and other things in the city.

Spend some time in urban Texas and see tons of people who LARP as a cowboy while commuting from their zero-lot line house to their office job. They'll tell you they need a truck, but probably won't be able to point to a single time other than moving a couch that one time a couple years ago where it was actually necessary.

A vehicle is expensive. But tons of people don't pay attention to their costs. They'll drive around town at 13mpg and spend thousands a year more on fuel, tires, maintenance, and more while never really using the capacity of the vehicle they massively overbought because "it's comfortable". What percentage of people would you realistically expect to know how much they spent on fuel and maintenance on their car over the last two years? How many would have any idea how much that could be cut with a smaller car?

I'm not denying people in rural areas probably have a far higher likelihood of actually using trucks as trucks. I'm pointing to all the people in places like Plano who act like a giant truck is an essential thing to own.

And it's hilarious so many construction workers think they personally need a truck for their job. Some of these are those people I personally know who think they need a truck. They're usually not using their personal trucks to actually do any construction work. Most would be able to go to their jobs and back home in a Civic. When they're at the job site they're using the company trucks to actually do the work. A friend of mine "needed" a truck for his home construction business, a business he owns, but in the end never actually uses that truck as a truck. He drives it to the job site, gets out, hops in his International, and uses that to actually haul stuff.

For any of his employees, why would they even want to just donate their personal truck to someone else's company's use? Probably the most expensive thing they own, and they're just going to put the most demanding and high likelihood of damaging activities on it for their employer's benefit. Nope, instead they often show up in beater Camrys and what not.

And like I said earlier, another friend said he needed a truck because he needed a vehicle that could seat 5. That was the reason. Sure, need a truck for that.

I get so many people on places like HN trying to tell me these people just don't exist or are somehow very rare. And yet most people I personally know who drive massive body-on-frame SUVs and pickups are these kinds of people. Few people I know who own trucks actually use their trucks as trucks. Only a few actually do things like haul salvage engine blocks and transmissions (something where you really kind of do want a bed to crane it in and out) or actually routinely tow something.


> Yet, when you/the GP above (complaining about all the F350s with not trailers) see them, they’re likely not doing that but they came into town for something.

I don't live in town, I don't work in town. I ride my bike and run and drive in rural New Mexico.


I just checked, I drive my truck just a few times per month, and it is still cheaper to keep it (paying taxes and maintenance) than to rent the correct vehicle for my rare needs to drive. Renting is expensive, but I did discover enterprise truck rental which I didn't know of before isn't too far from my house. Plus by owning my own truck I have it when I want to go.

Of course I ride my bike most places. There are a few trips every month I make not in bike range though. If like most people I drove a car to work, then a tiny compact car plus renting a truck when needed would make sense. But I drive so little that a large truck that can do anything is cheapest (I've had the truck for 15 years).


The only trailers I can find for rent have surge brakes (or no brakes at all, and thus too light duty for what I want to haul). I'll keep my trailer with electric brakes just to avoid those.


I see a ton of 5 star reviews that just say something like "Super fast shipping!" and think, "OK, have you even opened the box? does it work? is this review for FedEx?"


Well, that's a review for fulfillment, which may or may not be done by the same entity responsible for the product. Many review forms aren't clear about whether they are asking you to review the specific product or the entire transaction experience.


I don't know, this seems to be a fairly broad statement that could allow enforcement against any number of schemes:

> The final rule addresses reviews and testimonials that misrepresent that they are by someone who does not exist, such as AI-generated fake reviews, or who did not have actual experience with the business or its products or services, or that misrepresent the experience of the person giving it.


> that misrepresent the experience of the person giving it.

I guess the FTC will be handing out fines for improper personal experiences now?


It's about the actions of the business, not the person experiencing the product. If John Doe submits a review that says, "I bought and used this product, and it sucks," a business can't edit that to say, "I bought and used this product, and it was amazing." That would misrepresent the reviewer's personal experience.


But what if that person has no actual experience with the product? Or has insufficient product experience to write a review?


Some will be obvious, such as a review for a book or game or other media item that hasn't been publicly released. I would expect a platform such as Amazon would have responsibility to suppress reviews for items that are not, and have never been for sale. A flood of reviews all coming in immediately after the product goes on sale, or a statistically improbable distribution of geographic locations would also be suspicious.
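
As a crude illustration of the flood check (all thresholds here are made up), it's really just comparing how many reviews landed right after launch against the listing's total:

  from datetime import timedelta

  # Flag a listing when an implausible share of its reviews landed within a
  # few days of it going on sale. review_times are datetimes; every threshold
  # is invented for illustration.
  def looks_like_review_flood(review_times, on_sale_date,
                              window_days=3, max_share=0.5):
      if len(review_times) < 20:            # too few reviews to judge either way
          return False
      window_end = on_sale_date + timedelta(days=window_days)
      early = sum(1 for t in review_times if t <= window_end)
      return early / len(review_times) > max_share

A real auditor would obviously combine this with other signals (verified purchases, reviewer geography), but something this cheap can run across a whole catalog.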


Amazon has a program (Amazon Vine Voices) that enables sellers and brands to send select customers pre-release items for review before the product goes on sale. Sellers/brands can also do that outside their sales platform. Note that these are almost always "free products in exchange for review" and are supposed to be appropriately disclosed.


Amazon is loaded with LLM generated reviews now. They stand out as overly wordy and rambling while being light on any critical discussion of the product.


To all commenters quickly pointing out the ways this rule is far from perfect: you are completely right. This being clarified, is the alternative doing nothing? Because that's where we are.


Rules degenerating into infinite whack-a-mole is a strong (though inconclusive) signal a mistake is being made. "Let's ban rent increases". "Whoops, now all the landlords are slacking on property maintenance; let's mandate maintenance." "Whoops, now all the landlords have stopped making improvements; let's let them increase rents X% when they spend at least $Y on improvements." "Whoops,..."

So you end up in some new equilibrium. Maybe that equilibrium is better, maybe it's worse, but it's simply not true that it's always better to do something rather than nothing, and pointing out the loopholes in the rules is valid criticism.


Well, I think where we are is having to prove it's fraudulent. Agreed, impractically difficult.


When the FTC says "we're cracking down on online reviews" with things like this the average Joe gains more confidence in them, so yes, the doing nothing approach is actually better IMO.


So never do anything unless you can guarantee a particular outcome?


That’s a stretch. But things like this only create a false illusion of safety/honesty which can actually be a tailwind for dishonesty.


My assessment is more that the average consumer won't have any idea that the FTC is doing this, so I am not real worried about the downsides.


Not initially, but in time they tend to hear about it. Some shops are bound to brag that their reviews are FTC compliant and unbiased, etc.


So, don’t do anything at all because there will always be an issue with anything you do? Being negative is a weakness.


How about; do things that you can enforce and expect a positive net impact from, do things in a way that will address the dozens of obvious first impression questions that came up here due to lack of specifics. If you’re going to do it, put some thought into its execution and administration.

And most of all, don’t make global generalizations on commentary that is quite specific and on a very particular topic.


They have though. This has been a 2 year process.

https://www.ftc.gov/news-events/news/press-releases/2022/10/...

https://www.ftc.gov/news-events/news/press-releases/2023/06/...

They probably came to different conclusions than you. And I'm sure they have reasons why they left some of that stuff from the original list out, because they spent 2 years looking at this rather than going with their "obvious first impression questions".

You'll also note from those links that they have already been pursuing some companies over this stuff. So they're probably aware of what they're up against.


Rating averaging methods _should_ treat scores with fewer data points as less trustworthy and either suppress showing the score or apply some early-rating bias. I.e. if users are sorting by rating, new products should never be near the top.

Otherwise it should be possible to sort products or even brands/sellers by age and prefer older ones with more reviews.

I'm not sure Amazon does the former at the moment, and it definitely doesn't do the latter.
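
For what it's worth, a minimal sketch of the kind of early-rating bias I mean, an IMDb-style weighted average; the 3.5 prior and the 50 "phantom" reviews are made-up numbers:

  # Pull low-sample scores toward a global prior so a product with three
  # 5-star reviews doesn't outrank one holding 4.6 across 2,000 reviews.
  def weighted_rating(avg, n, prior=3.5, m=50):
      # m acts like m phantom reviews at the prior; bigger m = stronger pull
      return (n * avg + m * prior) / (n + m)

  print(weighted_rating(5.0, 3))     # ~3.58, barely above the prior
  print(weighted_rating(4.6, 2000))  # ~4.57, large samples keep their score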


I do not think the collective rating on Amazon is a blind average of the individual reviews and ratings. So what you say is probably already being done along with use of other similar signals.


This doesn't help when every useless Chinese widget on Amazon with an RNG-created brand name has literally thousands or even tens of thousands of fake reviews. Yeah, like 10,000+ people were so enamored with this {insert useless item here} that they felt compelled to leave a 5-star review. Amazon has totally sold out, like eBay. I don't shop on either anymore because it's hard to find real brands, and feedback and reviews are fake. Not to mention the blatant fakes of major products ...


Unfortunately, for some of the weird things I need, I can't figure out who else sells them. I can search Amazon or eBay and find someone, but they don't have a presence elsewhere (at least not that I can find).


Not sure why this was flagged - it echoes my experience pretty accurately!


I'm curious about the opposite practice: sharing reviews across several SKUs. I basically stopped looking at reviews because they were unrelated to the one I was buying.

I get that some products have configurations, like color and size, but often times wildly different products are grouped together.


On Amazon you can filter by the current configuration on the review page (at least on desktop).


I do this all the time because so many sellers bundle disparate products under one listing and rating. It's annoying that Amazon buries the option in the "See More Reviews" page which is only linked near the bottom of a product page.


On mobile they make it pretty hard to read reviews (or maybe I'm in some sort of A/B test where I'm only allowed to ask their LLM what the reviews say?)


Case in point: candle scents.


It's in the rules. Emphasis mine:

Fake or False Consumer Reviews, Consumer Testimonials, and Celebrity Testimonials: The final rule addresses reviews and testimonials that misrepresent that they are by someone who does not exist, such as AI-generated fake reviews, or who did not have actual experience with the business or its products or services

If you covertly switch the product, then the reviews shown are from people who did not have actual experience with the product.


Something similar to this happens on eBay. Sellers will sell a product, say a USB adapter, cheap and fully functional; users leave reviews, and then the seller changes the listing to a completely different item, retaining all the previous ratings and sale counts. How this would apply here is a good question.

Wouldn't like to assume, but regulatory bodies usually think about these things in advance, no?


Haven't eBay reviews always been meant to be about the seller and not necessarily the product? eBay started with the expectation that it was normal people auctioning used goods. Having reviews for a specific product doesn't make sense when there is no fixed product. Obviously things have changed over the years, but the site is still largely built around those assumptions.


Yeah, so when you view a listing now from a business it will show "100 units sold", but you're right, it's crazy you can just change the whole product. I think it's specific to business sellers.


https://www.ftc.gov/news-events/news/press-releases/2023/06/...

They proposed including review hijacking. There's probably a reason why they didn't include it. Or maybe they think the rules they included already cover it.


In general, this generation of application developers (not just mobile) never learned one of the most basic rules of UX, that goes back to the 90s - don't bug the user with unnecessary information and absolutely avoid using modal popups unless it's really important (to the user, not you). Those responsible for the business decision to add newsletter subscription and 'try our mobile app' popups in addition to app review ones need to be strung up with piano wire.


Airbnb's business model in a nutshell :)


I mean, step 3 would be illegal... your question is impossible to answer, since you hand-wave the illegal step by saying "they do it in such a way that the FTC isn't going to crack down on it".

This is basically the equivalent of saying "How are you going to stop crime X if they commit crime X in a way that lets them get away with X?"

Either they find a way to enforce the rules against step 3, or they fail to do so. We can't know yet.


The online shoppers that I know have learned to pass on products with only a few high reviews in a highly competitive space. If the signal is weak, it's not something to trust.


Shop other places besides Amazon. They've enabled all of this to increase profits.


>How does this stop one of the most common practices?

It doesn't, as long as the US keeps operating on only the letter of the law. It's obvious you're trying to work around a law that might be incomplete. Everyone involved knows they're trying to play around and find a loophole. Everyone knows it _should_ be illegal, but isn't. As long as the US legal system does not punish blatantly breaking the spirit of the law, you're going to get screwed.


yes, "This final rule, among other things, prohibits [...] certain review suppression practices[...]"

https://www.ftc.gov/legal-library/browse/federal-register-no...

Further down the notice cites the scenario: "[...] more than 4,500 merchants that were automatically publishing only 4- or 5-star consumer reviews"


The actual rule for this part is farther down, in Section 465.7(b) (p. 161 of the PDF). My reading is basically that if the website is showing something that looks like all of the reviews, then those can't be filtered in this way. But that seems to leave open cherry-picking reviews - e.g. don't imply you're showing them all.

  ... receiving and displaying consumer reviews
  represent most or all the reviews submitted to the website or platform when reviews are being
  suppressed (i.e., not displayable) based upon their ratings or their negative sentiment...


I guess manufacturers will now just have to reject anything other than a five star review immediately. As long as it is not submitted, it cannot be suppressed


FINALLY a use for MangoDB https://github.com/dcramer/mangodb


That lists a complexity of O(1) for all operations. I'm not sure that will scale. I expect my database to implement O(0) or lower complexity.


Will you settle for O(sqrt(1))?


Square root of one is still one.


Come on, regulation can only go so far. The ruling is regarding third party review aggregators, a discussion about self hosted reviews is off-topic.


Amazing all the newfound lawyers in the HN section here pointing out "loopholes" in the rule and then getting corrected by the next commenter.

The FTC continues to do the good, thankless work of making good public policy. I appreciate it.


> The FTC

It seems to me the FTC under Lina Khan. Before that I just don't remember it having so much pro-consumer impact.


Lina Khan is a national treasure.


But did the FTC think about this loophole that I just thought of in three seconds? I am so smart!


Do you expect that a year from now (or two, or however long you think is a fair amount of time to pass), online reviews will be noticeably better/more useful than they are today? I think the underlying thread here is that most people don't expect this to be any more effective than anti-spam or anti-robocall rules.


Probably, yes.

And by the by, I get significantly less spam and phone calls than I used to. Vastly fewer and they're all clearly scams now, which makes it easy to ignore.


I'd be surprised if the next administration doesn't remove the rule.


> Amazing all the newfound lawyers in the HN section here pointing out "loopholes" in the rule and then getting corrected by the next commenter.

Semi-quoting n-gate (RIP in peace): "Hackernews take turns incorrecting one another"


I'm all for consumer protection, but I don't think this is in any way good policy or a good use of time. It's too granular and companies acting in bad faith will obviously continue. That's IF it's enforced and not challenged and thrown out by the fifth circuit or something.


It's almost like they expect the law to be dysfunctional or unevenly applied.


>> > the rule bans reviews and testimonials attributed to people who don’t exist or are generated by artificial intelligence, people who don’t have experience with the business or product/services, or misrepresent their experience.

I guess they don't know how people scam Amazon reviews by getting legit people to simply buy the product and leave a five-star review, then get reimbursed for their purchase later by the company (or by the firm the company hired to recruit these reviewers).

(From 2022) Inside the Underground Market for Fake Amazon Reviews

https://www.wired.com/story/fake-amazon-reviews-underground-...


Actually that's covered by the rule.

Buying Positive or Negative Reviews: The final rule prohibits businesses from providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative. It clarifies that the conditional nature of the offer of compensation or incentive may be expressly or implicitly conveyed.


I hope this is actively enforced with real teeth very soon. I 1-star fake products and call them out in reviews, which results in the devious vendor somehow being able to send a postcard to my real physical address offering money for 5 stars. The sham vendors also spam my email weekly. Amazon appears to actively support this process. It needed to be curtailed decades ago.


>> The final rule prohibits businesses from providing compensation or other incentives.

Amazon has had this rule in place for a long time and I still get cards in the boxes of the stuff I buy, "Give us a 5 star review and get 30% off your next purchase!"

Clearly Amazon doesn't know about this or isn't generally enforcing it. I'm wondering how the FTC is going to patrol this since Amazon has already had this rule in place for a while and it hasn't dissuaded sellers from changing their habits.


> Clearly Amazon doesn't know about this or…

Given that hundreds of people reading this thread have experienced exactly what you’re talking about, I think it’s impossible that Amazon doesn’t know anything about it.


The FTC can force Amazon to do more about it. Just proving they are trying would be a big help.


Amazon is currently providing a LLM-generated summary of these faked customer reviews. To abide by the FTC ruling, Amazon would now have to prove that all of their training data is legitimate customer reviews. Do you think they will actually do that?


If the FTC wants to, it can. The government has a lot more power than Amazon; the only question is whether it will use it.


It seems the gift of free AWS cloud services reconciles all harm Amazon continues to do against customers and employees alike. The government will need to locate its backbone.


The government is not a single entity. Those investigating this type of thing are rewarded for success, and are not in any way related to those who would use the services.

(as pointed out, it is also illegal for AWS to do that)


That conspiracy theory needs work. The federal government pays billions annually for cloud services and it’s “people go to jail” illegal for the government to accept free services which would otherwise cost money (i.e. the government can use the AWS free tier like everyone else but above that they’re paying like everyone else).


It's not a conspiracy theory. It's business as usual for AWS. I'm all for righteousness but that's not applicable to the US Government and DOJ. People are bought and sold all the time.


Okay, where’s your evidence? Government spending is public so you should be able to say who’s using AWS for free.


The people I've met that leave reviews for free product aren't required to leave any "particular sentiment". They just rely on tacit laws of reciprocity.


I've gotten lots of offers of discounts in exchange for a review.

Not one has ever conditioned it on expressing a certain sentiment, rating, or anything at all.

But I think most people feel strongly enough they should leave a positive review in exchange for money. It doesn't even need to be said.


I think the bigger issue Amazon will face is that you can edit items in a big way... it's not just clarifying "Multi-socket extension cord" to "Three socket extension cord" but swapping out products wholesale once you've built up a pile of good reviews on a listing.

Honestly - Amazon really needs some serious lawsuits to force it to stop being such a bad actor in the online retail space.


> Honestly - Amazon really needs some serious lawsuits to force it to stop being such a bad actor in the online retail space.

I think Amazon would say that they are not a bad actor at all and in fact are providing a meaningful service to consumers and are a major driver of the economy, and besides it isn't really a problem because AI[1] yadda yadda yadda.

The truth is:

a. fake reviews make them money, and

b. almost no matter how bad fake reviews get on Amazon, people will continue to dump dopamine into their brains by buying shiny baubles that they might never take out of the box. The "joy" is in purchasing these things, not actually using them.

[1] https://www.aboutamazon.com/news/policy-news-views/how-ai-sp...


This is an extremely hard problem to solve. What degree of change makes it a different product? And that doesn't even touch the problem that products can look identical on the outside and use cheap crap on the inside. Amazon is not a bad actor here. They have every incentive to solve this problem. But they won't, not because they don't try, but because this is a problem as old as commerce.


It's not hard at all, it just needs moderation. Amazon is absolutely the bad actor because they allow sellers to edit their listings to utterly unrelated items, rather than having moderators reject those changes. It's not hard to prevent a cheap kitchen utensil with 2,000 positive reviews from being edited into an expensive drone.

And while moderating things like social media at scale has a lot of challenges, moderating product pages does not. There are orders of magnitude less of them, and they don't need to change that often.


It's a hard problem for a computer to solve - a computer shouldn't be used to solve it... computers were never used to solve it before Amazon because it's clearly a hard problem (and it scales really well with human labor).

Amazon are being a bunch of cheap bastards and skimping on human moderation of product listings - we, as a society, don't need to give them a free pass for trying to make an even more enormous profit. This is only deeply unprofitable to moderate if you have a lot of products listed you're never going to sell any of.


This is 100% the problem.

Suddenly we now have a ton of "new" issues cropping up everywhere. "Suddenly" being the last 20-ish years. These aren't "new". They're just difficult to automate with a computer program, and every company is cheapo now and tries to automate everything with a computer program.

This problem doesn't exist at, say, Walmart. Presumably they physically vet products to at least some degree.


Walmart shuffles parts the other way - the barcode will change every year or so or whatever so they can be sure to clearance out the old.

Walmart’s online store has some similar problems. But make it maybe $5 to list a product, $10 to change it, problem solved. Now you can hire real humans.


Users need more ability to intelligently contribute. I just hit this yesterday: a 2-star review that was actually entirely about the third party that shipped expired stock, not about the product itself. All I could do was flag the review as "other", with no text. (As it was the only review, I also reported it under "something wrong with the product", which does allow text.) And specifically give us "wrong version", "wrong product" and "seller, not product" flags. And don't reject my review that clearly called out that this isn't the real thing. I didn't simply get a counterfeit; the whole listing was counterfeit.

Abuse problems? Give more weight to squawks by people with a lot of purchases and not a lot of what are found to be bogus gripes.


> Amazon is not a bad actor here.

Just to be clear, Amazon is the bad actor here, assisting worse actors. This problem exists because they don’t want to spend money managing vendors, and it’s not a problem for anyone else. I never go to Walmart and discover that the cereal has been replaced with sawdust because even a huge, cost-obsessed retailer will hold their vendor accountable for that.

If the government started enforcing real penalties, each order would have an easy way to report this, they’d actually accept abuse reports when you get a contact from a vendor buying reviews, and they’d start spending money reviewing products themselves like other retailers do.

If you haven’t heard of it, in addition to traditional inventory there’s an entire profession around “mystery shopping” where businesses pay normal people to use their services and report their experiences. They’re explicitly trying to catch dishonest middle managers and suppliers who might do things like pull expired food from the shelves when it’s time for the annual inventory or make a better meal for someone they recognize as a corporate employee, and it’d be very, very easy for Amazon to check sellers the same way and if they actively solicited abuse reports they’d have an easy prioritization mechanism. It’d cut slightly into their profit margins but I doubt Bezos would even have to reduce the number of spacecraft he buys because I’ve heard so many people mention not buying things on Amazon because they’ve been burned by fraud in the past.


"generated by artificial intelligence" ? So if I write "this product sucks" for a review and I use Bing or some other source to rewrite this to "this product's quality does not live up to the manufacturer's claim" based on my input does that make it a crime?


I read it as "attributed to people who ... are generated by artificial intelligence.'

Insurance against the argument that "This person who wrote the review does exist, just not in a flesh body, they're an AI creation." But that might also be an instant-flop argument legally since I'm sure "personhood" has some definition near-future AI can't hope to approach.


Christ almighty you people are exhausting.


Of course they know. One thing at a time.


Especially now that literally anything the FTC does could be struck down by a federal judge at any time, unless it is explicitly written out or delegated legislation.


The press release from FTC containing the entire rule: https://www.ftc.gov/news-events/news/press-releases/2024/08/...


> It also bans businesses from creating or selling reviews or testimonials. Businesses that knowingly buy fake reviews, procure them from company insiders or disseminate fake reviews will be penalized. It also prohibits businesses from using “unfounded or groundless legal threats, physical threats, intimidation, or certain false public accusations.”

Still seems like it leaves in a giant loophole for all of those overly-cheery reviews that start with, "This item was provided to me by the manufacturer in exchange for a fair and honest review!"


You are no longer allowed to provide compensation for reviews. So companies can still send out stuff for you to possibly review, but they can't make receiving items dependent on actually writing a review, even 'implicitly', though we'll see how enforcement shakes out.


It will be impossible to enforce. The people who don't leave good reviews simply will get dropped from the mailing list. However, it forces the whole thing to kind of move underground, which should help at least reduce the scale of the problem, and creates a deterrent against getting too aggressive with it.


And if enforced aggressively, will only provide a set up for false flag operations to get a competitor banned for fake reviews. I think we've already seen this movie in SEO....


The evidentiary standards for Google search ranking changes are VERY different from those used for FTC enforcement actions.

I'm pretty sure getting caught for trying to frame a company for buying reviews would bring criminal charges that are more serious than the FTC enforcement action.


Could coupons be a way around this? ... [deleted]

edit: after RTFM, page 42, coupons are considered valuable:

> For the reasons explained in this section, the Commission is finalizing the definition of “purchase a consumer review” to mean to provide something of value, such as money, gift certificates, products, services, discounts, coupons, contest entries, or another review, in exchange for a consumer review.


Why would a coupon be a way around this?

I think that document is specifically about taxes and coupons. It is not intended to define "compensation" for every statute in California, and certainly not for federally issued rules from the FTC.

Even then, that rule is about whether the coupon issuer is compensated when a coupon is used, NOT about if a customer is compensated if they are given a coupon.


Wouldn't this ban a huge swath of YouTube reviews? I've watched plenty of YouTube videos where someone uses a product and says something like "I had no interest in this thing, but the manufacturer offered it to me for free if I made a video of me using it and giving my impressions of it."


Compensation can still be provided as long as it is not conditional on the reviewer expressing a particular sentiment. So your example could still be allowed.

> The final rule prohibits businesses from providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative.

https://www.ftc.gov/news-events/news/press-releases/2024/08/...


You can non-explicitly enforce positive review coverage by simply not sending review units to people who are likely to say things bad about your products. If you send early review units to 10 people one year, and the next year only 6 of them get review units, and the 4 people who didn't get review units this year were the 4 who gave the harshest review, the message is now out that you need to say good things if you want to continue getting early access to devices for reviews.

SnazzyLabs is a good example - he should be well within the criteria for Apple to be sending him iPhones and Macs early, but I can only assume Apple thinks he's too critical when he finds an issue he doesn't like. Thus he has to buy his review units on street release date along with everyone else. How many people are giving less critical reviews because of that?


nVidia tried to pull this stunt with the YouTube channel Hardware Unboxed. They weren't singing the praises of RTX and DLSS loud enough for nVidia and were threatened with having review samples withheld until they changed their tune.


> It clarifies that the conditional nature of the offer of compensation or incentive may be expressly or implicitly conveyed.

So implicitly only offering review units to positive reviewers is still not allowed.


So if I style myself as a negative-only reviewer, they have to give me a review unit? Like I'm that judge in the Olympics that never gives anyone a perfect score. The best your product will get from me is a 2/10.


They would send you all their competitors products to review instead.


I would like that arrangement, as I would eventually end up with multiple units of each brand's products by criticising all of them.


I will undercut you by offering only 0/10 reviews on all products


I would guess it would be more like your solicited reviews should not deviate significantly from your unsolicited reviews. Perhaps there will be a market for "negging" reviewers? Those who criticize, but only in ways that don't impact sales?


OK, then this has been the de-facto standard amongst many industries for a long time. Plenty of reviewers say stuff like "it really is weird! I made one video about how I didn't like a product. After that I was never invited to attend a launch for a product, get early access, etc. I guess those two could not be correlated at all!"

Based on the text you have shared it'd be perfectly fine if you were paid to write a "neutral" review.


The rule also prohibits implicit compensation, but we'll see if it's enforced/enforceable.


I read it as still being allowed to offer compensation for a review, just not compensation for a review with specific content/sentiment/rating.


Sadly, this may mean the end of such literary gems as "Hell Holds No Surprises For Me Anymore":

https://www.amazon.com/gp/customer-reviews/RZFIYJTPVUZ94?ASI...


Nope, doesn't look like it'll stop you personally from leaving silly reviews.


Ok I'll bite... What am I looking at?


It's a farcical "review" of Haribo Sugar-Free Gummy Bears.

At some point, it was determined that the sweetener (or something) in these, caused ... adverse ... reactions, in some folks.

If you look at all the reviews for that product, you will see many more, but this was the prize-winner.

Actually, I suspect that the ratings went to a different (sugared) product, or that Amazon moved the products around, so the ratings applied to something different.

The sugar-free variety was the problematic one.


Do testimonials count as reviews? Bad news for all the product launches I see on here which are endorsed by 10 unidentifiable "people" with abbreviated surnames and suspiciously stock-photo-ish headshots, if so.


Hoping it does.

When real, testimonials are hand-picked from the full set of reviews and feedback. That practice should be banned too.

Of course, creating rules only does so much. The real deal is the ability to enforce them in practice.


Officially banning fake reviews to introduce liability is a good start, but the real challenge with reviews is the incentive structure.

For positive reviews, a business will figure out customers who they already know had a positive experience (quick delivery, continuous usage, etc) and only send them invites to review. This is perfectly legal and the fundamental business model of many review websites - selling the ability to push invites and “manage” reviews.

For negative reviews - no business wants these, and customers with bad experiences are likely to post them by themselves.

What gets left out is the average experience because reviews are essentially cherry picked from the head and tail ends of the normal curve of experiences. This doesn’t render reviews useless, of course. Having a large number of positive reviews is still a positive signal but it is nowhere close to free from manipulation.


When iOS + apps came out, Apple had a system whereby when an app got uninstalled it prompted the user for a star rating and review. Guess who was doing all the uninstalling? People that hated the apps, and app ratings reflected that.


> the rule bans reviews and testimonials attributed to people who don’t exist or are generated by artificial intelligence, people who don’t have experience with the business or product/services, or misrepresent their experience.

Does the rule apply to private citizens? I wonder if the First Amendment agrees with penalizing private citizens "who don’t have experience with the business or product/services, or misrepresent their experience". They may mean that businesses can't engage people to write such reviews.

Also, how will they handle the scale of enforcement? The large companies seem easy - one enforcement action covers all of Yelp, another all of Amazon, etc. But what about the infinite reviews at smaller vendors?

Overall though, I think this is great and long past due. The lawlessness of the Internet - fraud, spying, etc. - is absurd.


> Does the rule apply to private citizens? I wonder if the First Amendment agrees with penalizing private citizens "who don’t have experience with the business or product/services, or misrepresent their experience".

I'm sure someone will try to argue that, but the way I interpreted it is that this is not banning people from sharing fake reviews, it's banning businesses from publishing and misrepresenting those reviews as genuine. i.e. It's regulating the business's practices, not the (purported) consumers'.


Effectively, I think it still bans joke reviews. You can submit a joke review, but the company cannot publish it


I think that would depend on whether your joke is clear enough that a reasonable person would not think it “materially misrepresented” your experience – if my review of a Python book says I learned how to import antigravity and was now flying around the neighborhood, it’s probably safe because readers would know that’s impossible. If my joke is too subtle or obscure, it’s probably better not to have it because it will likely confuse people who don’t recognize it.


That seems... fine? I'm not sure I understand your point.


- "Does the rule apply to private citizens? "

The rules do not apply to "reviews that appear on a website or platform as a result of the business merely engaging in consumer review hosting." 16 CFR § 465.2(d)(2) (2024) They apply (paraphrased) to things someone is paying someone else to say. Things people write about products without being paid to write them are uncontroversially First Amendment-protected opinion.

https://www.ftc.gov/legal-library/browse/federal-register-no... (starts on page 153)

- "penalizing private citizens "who don’t have experience with the business or product/services, or misrepresent their experience". They may mean that businesses can't engage people to write such reviews."

I'm not a lawyer, but I think the AP article actually misstated the law. The multiple paragraphs related to this only seem to cover the case where a review "materially misrepresented... that the reviewer used or otherwise had experience with the product". The way the AP paraphrased this is different. They separated out "or misrepresent" with an "or", but it's not separate.


> Does the rule apply to private citizens? I wonder if the First Amendment agrees with penalizing private citizens "who don’t have experience with the business or product/services, or misrepresent their experience"

Maybe I'm wrong, but doesn't the First Amendment apply to public speech? Are there some nuances there when a private company is involved and responsible for the content on their platform, in this case reviews? Genuinely never sure of these things for the US.


It does, but this isn't personal speech but corporate. I can say that buying an iPhone makes me smarter and irresistibly attractive and it's covered by the first amendment, but it becomes corporate speech once Apple pays me to say that, and is thus subject to regulation about honesty and disclosure. This is noticeable in ads for things like athletic products – even if it's a pro athlete they very publicly sponsor, the claims tend to be things like saying they only run in those shoes or that some sports drink is part of their training, because they don't want a legal situation where it sounded too much like "Nike said I could run a 4 minute mile if I bought their shoe!"

Review curation is an especially good target for this because the question isn’t the speech but rather whose speech is promoted. Nobody gets in trouble if they accept testimonials and only use positive ones in ads because consumers know those aren’t unbiased but a review page which looks like anyone can post there is making a different promise.


Almost all of the rules include the clause "for a business". The only rules that don't to my eye are basically "no one can make libelous or threatening statements to have a review suppressed or removed" and "no one can sell, distribute, purchase, or procure fake indicators of social media influence [...] for commercial purposes"


Does the first amendment protect financial fraud? Is this strictly a speech issue? Doesn't the first amendment only apply to people in the US? I ask because the shenanigans are world wide.


There's zero first amendment problem.

Because you're free to post as many false reviews on your own personal blog. Nobody is silencing your views.

But a product page is not allowed to publish those views. And businesses have never had first amendment rights to publish falsehoods.

It's no different from ingredient listings on food. There's no first amendment right for a business to lie about the ingredients.


> Does the rule apply to private citizens? I wonder if the First Amendment agrees with penalizing private citizens "who don’t have experience with the business or product/services, or misrepresent their experience". They may mean that businesses can't engage people to write such reviews.

The First Amendment doesn't typically protect your right to commit fraud, no.


I used Downy Super-Gentle Laundry Soap and my clothes fell apart, like it was an acid! Hacker News is secretly controlled by Mark Zuckerberg and Laurene Powell Jobs! ...

Where is the FTC? dang might delete my comment or ban me, but the government has no right to do a thing.


You aren't attempting to defraud anyone.

The intent is clearly to prevent entities from publishing clearly fake/ill-gotten reviews. The first amendment does not protect your speech when that speech is used to assist in committing another crime. The second amendment exists, but that does not give you carte blanche to shoot people (extreme example).

For a speech related example, see the Freeman v Giuliani case[^1], where the defense stated that they "have a first amendment right to lie," which was ruled to not be the case in defamation.

Also remember that there needs to be some measurable level of harm inflicted. A silly comment in this thread is unlikely to have any measurable level of harm, but cheating reviews may result in tens to hundreds of thousands of dollars in sales.

[^1]: AP News: https://apnews.com/article/giuliani-2020-election-georgia-de...


Hopefully this part fixes the Amazon review problem, because it lets them go after Amazon itself...

> It also prohibits them from ... disseminating such testimonials, when the business ... should have known that the reviews or testimonials were fake or false.

Many of the Amazon fake review practices are extremely in the "should have known" category.


Don’t worry, a judge in Texas in the pocket of some big company will shoot this down, just like the attempt to abolish non-competes


[flagged]


How about we get Congress to do their jobs?


Yes, let's make laws by passing laws.


> the SCOTUS judges are the de facto experts of all matters and the regulators

And thankfully so.


Curious why you think these nine individuals are better suited than people with actual expertise?

In the ruling in which they self-ordained these new powers, Gorsuch confused nitrous oxide with nitrogen oxide, five times. Better hope Gorsuch didn't rule on dental anesthetics on your next visit to the dentist.

https://www.scientificamerican.com/article/the-supreme-court...


Especially since one of the main arguments I have heard from folks who approve of the ruling is "but these agency people are un-elected officials!" So are Supreme Court Justices, who unlike agency people have no ethics rules.


Because allowing the "people with actual expertise" to decide for themselves how much power they have over the lives of others is a blatantly obvious conflict of interest?


As opposed to the judges who fly on donor private jets and then take the cases of the said donors. No conflict of interest there. Zero. And, oh yeah, zero enforceable ethics rules.

The disdain for expertise is something I would not expect on HN, but here we are.


>The disdain for expertise is something I would not expect on HN,

I would: HN is full of conservative Americans.


I hope the FTC staffs a large office for enforcement on this. There are surely many hundreds of companies in the business of selling fake reviews, many of them outside the US, and I don't expect much change in the consumer reality of "most reviews are fake" without a great deal of investigatory effort tracing money flows to shut these operations down.


I think we ought to be focusing not on whether it was "real", but on whether it was written by somebody that the user trusts (or maybe there are n trust-hops between reviewer and user). That way users have recourse when they're misled: they can revoke trust in whichever connection exposed them to the misleading review.

Eventually the scammers will be isolated such that they're just paying each other to lie to each other, meanwhile the rest of us can be authentic with each other: we need to learn trust hygiene and bake it into our apps.
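
To make the trust-hops idea concrete, here's a minimal sketch in Python (all names and the graph representation are my own assumptions, not anything from an existing review site): a BFS over a follow/trust graph that only surfaces reviews written within n hops of the reader.

    from collections import deque

    def trusted_reviewers(trust_graph, reader, max_hops=2):
        # trust_graph: dict mapping each user to the set of users they directly trust
        # (a hypothetical representation; a real app would pull this from its own data).
        seen = {reader}
        frontier = deque([(reader, 0)])
        while frontier:
            user, hops = frontier.popleft()
            if hops == max_hops:
                continue
            for trusted in trust_graph.get(user, set()):
                if trusted not in seen:
                    seen.add(trusted)
                    frontier.append((trusted, hops + 1))
        return seen - {reader}

    def visible_reviews(reviews, trust_graph, reader, max_hops=2):
        # Keep only reviews whose author is within max_hops trust edges of the reader.
        reviewers = trusted_reviewers(trust_graph, reader, max_hops)
        return [r for r in reviews if r["author"] in reviewers]

Revoking trust is then just deleting an edge from the graph, which drops any reviewer who was only reachable through that connection the next time the filter runs.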


I would love a restaurant review site that could tell you “your brother’s friend from college has been here once and liked it”.


I'd definitely use that, though I'd prefer a social graph protocol which restaurant pages just sort of tap into, rather than having a dedicated site. That way we can use the same trust graph for many applications rather than having the restaurant review graph be separate from the scientific peer review graph which is separate from the trusted mechanic review graph etc...


I went to a salon recently and was told I could get 10% off by leaving a 5 star review BEFORE I received any service. That is something I really hate and I wish review sites monitored for that more. Would this be the same thing as buying reviews?


The number of establishments I have seen doing this has skyrocketed. The last 2, I edited my review afterwards to 1 star and saying “this place gives a discount for good reviews”.


I mean, I would just leave.

Because I'm not going to leave a review; I don't really leave many reviews. And I'm especially not going to leave one before I've received service. But if I don't leave a review, I'd be concerned I'd get deliberately poor service.

And if I'm going to get bad service, why should I subject myself to that?

If anything, I would leave then give a 1 star review saying they give discounts to people who give good reviews beforehand and the explanation I gave above.


And they delete your review. There needs to be a requirement to archive all reviews for seven to ten years: when it was posted, how long it was up, content, and user. This is such a rabbit hole.

People cheer when government makes a rule like this but there is a huge costly enforcement mechanism that goes with this. That has to be implemented and maintained. Making the rule is step one, and there are hundreds more steps that have to happen any number of times, forever. Good luck. Making laws that cannot be enforced just increases the cost of government without having the intended effect. I can't think of anything that the prime offenders would like more than that.


It is truly difficult because you do have people who leave fake negative reviews as well. And fake reviews, whether good or bad, should be deleted. They are useless, they are only there to affect review scores.

It's a convoluted problem with no good answer so far.


Yes, that would probably fall under this, because 10% off is a form of monetary compensation. Most review sites already ban this type of thing, but businesses do it anyway.


If you think that's bad, I've seen doctors do this (albeit after providing service.)


Leave a review, get the service, edit the review later.


Is there something about the American system such that the FTC is more active/aggressive when Democrats are in office? Anyone else notice this trend? If it's real, what causes it?


Of course. The chair of the FTC is a political position so changes with each administration. Very broadly, US Democrats are more in favor of regulation to stop abuses and US Republicans are more ‘hands off, the market will sort it out’ in their approach.


With the FTC in particular, neither party wanted the FTC to be aggressive. Biden wanted to appoint someone else but had all his appointments blocked by the far-right. Lina Khan got in because populists (left and right) saw her as a weapon that could be wielded against Big Tech.

There's actually a big cloud hanging over the Kamala Harris candidacy over whether or not Lina Khan will remain FTC chair. There's a lot of tech money flooding into her campaign. Though in this case it's also to replace the current SEC chair, because the SEC chair is actually enforcing securities law against crypto fraudsters, who would really like to keep their scam going.

Same with Trump. Big Tech banned him for, y'know, instigating a coup d'etat. But three years later, Big Tech is now trying to wine and dine him, because the FTC is scaring the shit out of Big Tech. You have Tim Cook going to Trump and Trump saying how he's going to stop the EU from attacking US companies. Hell, Elon Musk bought Twitter just so he could turn it into an arm of the Trump candidacy. And who knows what Mark Zuckerberg thinks. Likewise, with the SEC stuff, Trump used to be a (rightful) big critic of crypto, until he realized he could make money selling tacky NFTs of himself, and is now also trying to get in on that crypto money.


>Is there something about the American system such that the FTC is more active/aggressive during Democrat office?

Well republicans generally shoot down anything that is pro-consumer at the cost of business profits, even when it's related to consumer awareness or safety, so the only way to get decent pro-consumer rules enacted is when democrats are in power.


[flagged]


These rules and regulations have been in the works for some time, so I don't know where you're getting that.


It wouldn't surprise me if this started while Trump was president, but because of timing is just coming out now. It would surprise me if this started in Obama's first term - but it wouldn't be unreasonable to find out it did.


That's a weird take.


In the US, the heads of these bodies, who'd generally be, by and large, career civil servants in parliamentary democracies, are appointed by the executive (about _4,000_ positions in the US government are presidential appointments!). Different appointees lead to different behaviour.


Some of the large cases that made Hacker News in the past year drew comments similar to yours, even though the investigation was started under Trump and only completed recently - if you like the Democrats, though, you will give all the credit to the Democrats.

Which is to say there are some political differences, but don't make such accusations before carefully checking that it isn't just your bias: you observe more when Democrats are in power and thus see more.


My comment is literally 3 questions and 0 assertions/accusations.


Questions that can easily be read as accusations.

We don't have enough information to answer them.


[flagged]


Uh, it's more like they spent several years working on it and it happens to be an election year when its published.

> [1] The final rule announced today follows an advance notice of proposed rulemaking and a notice of proposed rulemaking announced in November 2022 and June 2023, respectively. The FTC also held an informal hearing on the proposed rule in February 2024.

[1]: https://www.ftc.gov/news-events/news/press-releases/2024/08/...


This spring our one-year-old dehumidifier died. The manufacturer would send you a new replacement unit, but first you had to leave a review of the new unit. After the review was submitted, they would send you an Amazon gift card worth the replacement value. So the old units that died never get a 1 star, and the new units being "sold" are getting 5 stars.

I guess it is still better than most companies that will find whatever reason they can not to replace faulty equipment.


You can edit reviews on Amazon. So absolutely go through that process, and once you've spent the gift card, edit your review to 1 star and explain why. Because that's disgusting corporate behavior.


Feels like they sort of buried the lede here with this little thing hanging off the back:

> It also prohibits businesses from using “unfounded or groundless legal threats, physical threats, intimidation, or certain false public accusations.”

It seems to me like most litigants believe the other side's case to be groundless. I'm curious how this will look from an enforcement perspective.


Also, what about deepfake images, videos, and the tons of AI-generated trash flooding YouTube?

It's interesting to see the government stepping in to make the industry around fake reviews illegal... possibly the next step, in five to ten years, is the government saving the Internet from the onslaught of all the AI-generated fake crap that's only going to get worse and worse.


Two questions:

Does this apply retroactively? If someone is found to have written fake reviews or paid to get them written in the past, would the rule apply to them too? (I hope so.)

Can the rule be imposed if the culprits are all outside the US? (Even if the E-commerce player is in the US, they may not (or may) have done anything wrong intentionally.)


While making an app, I'm learning what other people in the industry are doing. One piece of "advice" is to put the App Store/Play Store rating dialog in the onboarding. The case studies show that it indeed improves the ratings by a lot, because people simply rate 5 stars just to get through onboarding.


I can't rely on marketplaces doing a decent job at this - that's why I use Fakespot [0] by Mozilla!

[0]: https://www.fakespot.com/


I've had decent results with Fakespot and not buying anything lower than a B rating, with very few exceptions (not enough reviews, for example). I think with a little diligence my Amazon experience has gotten better the past few years, since I don't just buy the first cheapest thing that pops up because I'm in a hurry.


What makes you trust that Fakespot and sites like it are themselves not compromised?

I've had similar success by manually avoiding items that look suspicious in any way, and only buying from Amazon directly, or known reputable sellers. This takes _a lot_ of work, and makes shopping on Amazon exhausting, but I don't see how Fakespot would make this any easier. My problem is not so much detecting fake reviews, but filtering out the possible quality items from the sea of junk copycats. It often happens that entire categories of items consist of junk, and you have no other choice but to pick the least worse one. Sometimes you get unlucky, but thankfully Amazon still has good return policies to cover these cases, so it's not a total loss. The amount of returns they process must be insane.

Come to think of it, I'd really like an AI-powered service that acts as an interface to Amazon search, and does this filtering for me. Bonus points for improving search criteria and allowing filtering by country of origin, ranking reputable brands and sellers higher, etc. Since Amazon will never build this, someone please do!


I trust them because they are owned by Mozilla, and I've had good results versus my own random judgement of reviews. I'm probably 95% good, and I get most of my stuff from Amazon these days. I do trust my gut feeling just as much, though, and have ignored their grade on what seemed an okay purchase.



It all seems quite simple to me. Just require an order number and the date of purchase to write a review and require all reviews be publicly available in a machine readable format and that anyone may publish them.

If you pay me I can write the same using 1000 pages without adding anything useful.
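
As a sketch of what "machine readable" could mean here (the format and field names are my own assumptions, not anything the rule or the comment specifies), each published review could be a small record tied to a verifiable order, dumped as plain JSON so anyone can mirror and audit it:

    import json

    # A hypothetical machine-readable review record; field names are illustrative only.
    review = {
        "order_number": "112-0000000-0000000",  # placeholder, ties the review to a purchase
        "purchase_date": "2024-08-14",
        "review_date": "2024-09-02",
        "rating": 2,
        "text": "Unit died after three weeks of normal use.",
    }

    # Publishing every review in a dump like this would let third parties detect suppression.
    print(json.dumps(review, indent=2))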


>> Just require an order number ...

This does not solve the problem. Fake reviewers often purchase the product on paper.

Behind the scenes, the seller may not even ship and instead buy fake shipment tracking from the shipping company.

This is much more complex than what you are thinking.


You are right, my suggestion involves the lowest hanging fruit where one can purchase unlimited positive or negative reviews from unrelated parties.

I can google a far away barber who has accumulated 17 reviews over the last 10 years. Google then allows me to shit talk the guy for the world to see. 2 minutes of effort and its free! They get to carry it around like a badge of shame forever.

You should give me a discount or ill post a negative review and unleash the roaches on your hotel. It would be rude if you didn't.

edit: I think a digital ID would be useful for real reviews.


Indeed.

Both sides, product/service providers and consumers, try to game the system. That makes it very hard to figure what is the truth.


This sort of regulation seems oddly granular and not super useful, but not terrible. Companies don't have to offer customer review features. I wonder what would happen if companies decided to remove reviews, rebrand, or switch to another paradigm?


I think the possible changes you mentioned are a desired outcome. The FTC’s consumer protection role tends to be about keeping free markets functional, and under Biden that’s had a lot of focus on removing market distortions. Reviews are tricky that way because people want to trust other buyers but scammers have exploited that trust, and a company removing the reviews section of their site because they don’t want them to be accurate seems like a net win similar to forcing companies to remove false claims. The FTC doesn’t require your product to be good, just honestly sold.


My objection is it's too granular and bad faith operators will only pivot slightly to continue to exploit. It seems ineffective.


Is this law going to penalize Amazon for not doing the very trivial amount of policing to block AI generated reviews on their site? AI review content is still obvious and Amazon must still benefit from it just as much as all their other dark patterns.


Next can we ban 1-star reviews that just complain about the shipping and not the product itself?


Why, if you're thinking of buying the product from the same vendor, would you not reconsider buying from a different vendor just for shipping issues? Shipping is a major part of buying something online, so I don't think it's a bad review to have available


It's a review of the process or the seller, not the product. That having been said, Amazon and most similar online retail "marketplaces" make the seller much less visible than the product, encouraging the reviews to go in the wrong place.


I like how you've reduced the entire internet to just Amazon. It is possible to have reviews on other websites. I know they really try to keep that a secret though.


If UPS put the package by your neighbor, switching vendors doesn't really do anything.

Or if you bought from a third party seller, but your review is attached to an FBA product, the shipping review has nothing to do with the current item.


If the review says their packages were improperly packaged to survive shipping with any carrier, that would be useful info. Finding out they took 6 days to arrange for delivery after the order is useful. Bad shipping doesn't just mean the selected carrier had issues.


>If UPS put the package by your neighbor, switching vendors doesn't really do anything.

Switching to a seller that ships using fedex or usps would do something. We've had the inverse problem, fedex is the one that always screws up our deliveries and we actively look for sellers that don't use them.


My only 2-star review on my book on Manning's web site is a complaint about not receiving the book. I don't know what made them give me two stars instead of one. Maybe they liked the order process. :) If they had reached out to me, I could have helped them contact Manning support. But there's no way I can reach someone from a review (there's only a name listed there, nothing else).


I know it's offtopic, but try reading some recipe reviews...

Chocolate cake recipe, needing a lot of butter, sugar, eggs, etc.

And below it a one-star review: "I only replaced the eggs with aquafaba, put in just a third of the sugar and a third of the oil to make it healthier, used cocoa powder instead of baking chocolate, and it was horrible, hard, clumpy, didn't taste good at all, never making this again, 1 star!"


Marketplaces need a vendor rating and product ratings. When leaving a review, the form could have sections on shipping separate from the product.


How about the App Store reviews on the Amazon app complaining about individual products, or podcast app reviews that are actually about individual podcasts?


So... wasn't this already fraud before, and thus covered by some penal code section?


There's a line between fraud and the First Amendment they're trying to navigate. Suppressing speech in the USA is notoriously hard, but can be done if targeted narrowly enough.


Thanks. Now I understand the significance of the ruling better. Love this.

I am in general interested in seeing free speech being replaced by truthful or well-intentioned speech. Fact-checking is hard, and GenAI is making this worse. In my opinion, it is time to reevaluate the notion of free speech.


> Specifically, the rule bans reviews and testimonials attributed to people who don’t exist or are generated by artificial intelligence

This seems a bit redundant, doesn't it?


Love FTC’s momentum in building fair trustable free markets.


Historic date 2024-10-21: the last time anyone lied in a review.

I’m glad they’re trying. It remains to be seen how this’ll sort out.


People can break laws. By your logic I guess we shouldn't have laws then.


> People can break laws. By your logic I guess we shouldn't have laws then.

Let me reverse this on you, how much and what type of punishment and penalty should be levied if laws are broken?

Like this false online reviews ruling, how far should punishment and penalty go?


> how much and what type of punishment and penalty should be levied if laws are broken?

Depends on the impact of the crime the person was convicted of and how likely they are to do it again.

> how far should punishment and penalty go?

For companies who knowingly solicit or publish fake/compensated reviews: disgorgement of profits and refunds without conditions to everyone who asks. Repeated violations come with escalating fines that are a percentage of revenue (not profits) plus bans on company officers holding the position of officer of a publicly traded company for a number of years.


I hope there's enforcement here. I mean just look at this garbage:

https://chromewebstore.google.com/detail/metamask/nkbihfbeog...

I once tried reporting these blatant phishing reviews, but apparently on Google the only way to report a review is as "Child sexual abuse material" if it's "policy related", and only as "court order" or "intellectual property" if it's legal related. It's a wholly unsatisfactory system riddled with garbage.


Reporting to Google is generally useless. Skim these search results.

https://www.google.com/maps/search/deck+builder+chicago/

A large percentage of these listings are fake. They're run by a single company that has blanketed Chicago with "lead generation" listings across several local service business industries.

You can report them to Google. Nothing gets done most of the time. Redressal form? Same thing. Escalate on the forums? Same result.

And this is only one such lead gen network that I know of. Google doesn't care.


So The Shed at Dulwich could not exist on TripAdvisor US and Botto Bistro could not be punished by Yelp?


Great, this solves it for all the small players. What about big tech? They will just ignore this ruling.


Wouldn't this make the glorious reviews for the Hutzler 571 Banana Slicer illegal? I mean this thing has saved and ended marriages, enabled people to live their dreams of starting zydeco bands, started the boomerang pigeon hunting craze, and much more.

https://www.amazon.com/Hutzler-3571-571-Banana-Slicer/produc...


Amazon no longer lets you see reviews that aren't on the main product listing without logging in. You can see the top reviews and the most recent reviews, and that's it; navigating to the page you linked or clicking one of the star ratings (see all 5-star reviews or all 1-star reviews) prompts a login.


Ah, the days when the Internet was fun.


Wow this will affect a lot of product owners I've known


Do you think there will be any impact on sites like HN?


If the product is fake too, does a fake review count?


How do they enforce this?


Ok, it's now banned. I guess my next question is - so what?

Without an enforcement mechanism to monitor the millions of review websites nothing will happen.

And can you imagine the effort required to prove a review is fake?


It's semi-comical that the subtitle contains "AI-Generated". Even gov is on the hype train.


Shouldn't this be the link - not some 5-sentence summary? https://www.ftc.gov/news-events/news/press-releases/2024/08/...

It would make many comments here pointless if commenters started from reading this:

The final rule prohibits:

Fake or False Consumer Reviews, Consumer Testimonials, and Celebrity Testimonials:

The final rule addresses reviews and testimonials that misrepresent that they are by someone who does not exist, such as AI-generated fake reviews, or who did not have actual experience with the business or its products or services, or that misrepresent the experience of the person giving it. It prohibits businesses from creating or selling such reviews or testimonials. It also prohibits them from buying such reviews, procuring them from company insiders, or disseminating such testimonials, when the business knew or should have known that the reviews or testimonials were fake or false.

Buying Positive or Negative Reviews:

The final rule prohibits businesses from providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative. It clarifies that the conditional nature of the offer of compensation or incentive may be expressly or implicitly conveyed.

Insider Reviews and Consumer Testimonials:

The final rule prohibits certain reviews and testimonials written by company insiders that fail to clearly and conspicuously disclose the giver’s material connection to the business. It prohibits such reviews and testimonials given by officers or managers. It also prohibits a business from disseminating such a testimonial that the business should have known was by an officer, manager, employee, or agent. Finally, it imposes requirements when officers or managers solicit consumer reviews from their own immediate relatives or from employees or agents – or when they tell employees or agents to solicit reviews from relatives and such solicitations result in reviews by immediate relatives of the employees or agents.

Company-Controlled Review Websites:

The final rule prohibits a business from misrepresenting that a website or entity it controls provides independent reviews or opinions about a category of products or services that includes its own products or services.

Review Suppression:

The final rule prohibits a business from using unfounded or groundless legal threats, physical threats, intimidation, or certain false public accusations to prevent or remove a negative consumer review. The final rule also bars a business from misrepresenting that the reviews on a review portion of its website represent all or most of the reviews submitted when reviews have been suppressed based upon their ratings or negative sentiment.

Misuse of Fake Social Media Indicators:

The final rule prohibits anyone from selling or buying fake indicators of social media influence, such as followers or views generated by a bot or hijacked account. This prohibition is limited to situations in which the buyer knew or should have known that the indicators were fake and misrepresent the buyer’s influence or importance for a commercial purpose.


Has anyone else gotten a tremendous number of people approaching them online on various platforms seeking people to work as fake review submitters? I always respond with open disgust and admonition, just wondering if it's just me


Good, but HOW?


Amazon must be quaking in their boots.


I'll believe it when it's enforced. The spam calls I'm getting are 'banned' too, for all the good it does.


If you don't have the time to thoroughly investigate material non-public information before deciding where to have lunch, are you even a responsible consumer? /s

The normalization of blatant lying in business is really frustrating, both as a businessperson and as a member of the public. We (correctly) consider just making shit up for their own benefit a major strike against a person, but we implicitly tolerate it in the companies that run a good chunk of our lives! Hell, in some cases we even celebrate it: "wow, look how scrappy that person is, what a brilliant marketing ploy!" - no, they're just a liar.


There seems to be a LOT of normalization of straight up lies, made-up-on-the-spot facts, and disinformation the past 6-8 years. Caveat emptor seems to be the SOP rather than a reminder to be skeptical. Of course, that's just my n=1 observation.


Amazonz doooomed


Good intentions by the FTC. Unfortunately nearly impossible to enforce. It's almost like the FTC banning junk/spam emails. Maybe I'm misunderstanding how this will be enforced and some big players will end up paying large fines. I think Amazon has to get their poop together and fix the commingling of product reviews and the other holes in their sieve that make this behavior rampant.


How is it impossible to enforce?

Bunch of people report Amazon as being rife with fake reviews. FTC puts together some sort of working group that does some research to figure out if it's true. If it's true, they reach out to Amazon telling them to fix it after handing them a fine. After a while, they verify that Amazon implemented sufficient safe-guards against fake reviews.

Sure, it wouldn't get rid of all fake reviews, but surely it'd be better than the current approach of doing absolutely nothing, no?


How can you enforce people giving fake reviews for things they bought? Bring the review police? How can you prove they're given free products to review them positively? Don't get me wrong, I wish online reviews weren't utterly broken but it seems like business wants it this way. I certainly hope this will get fixed and not jump to the next loophole.


> business wants it this way

Of course they want it. It's pure upside for them and purely deceptive to the consumer. Therefore, it's the perfect thing for the FTC to regulate - I mean, this is what their purpose is.

Enforcement will be difficult, but I really don't think platforms like Amazon are the problem. They're a unified platform; it's pretty easy for them to enforce better reviews. Maybe you need to have actually bought the product, maybe they monitor product descriptions for review solicitations, maybe they audit packages for those little "review us 5 stars!" slips, maybe they prevent modifying products, etc.

The true tough thing to enforce is little shops. You know, convenience stores, smoke shops, that type. I've been told, verbally, many times that if I review 5 stars, I get some discount. I doubt the FTC will send physical agents to check that.


I hope this will work for fixing Amazon. But how about a million other websites with fake reviews?


It's hard to bug-squash website by website for sure. They scatter like cockroaches in the light.

But, I think most online buying in the US goes through Amazon and maybe a couple other online retailers. Fix it there and you fixed the problem for 90% of cases.


It isn't hard - you can't get everyone, but find a few influencers who you "know" are doing this. Then make them aware they are being investigated - even if you don't have enough evidence to convict, the awareness that they could be in trouble will make them stop. Or better yet, tell them you are gathering evidence, but if they cooperate with the investigation you will let them off - then they give you a copy of all the illegal communication trying to buy their good reviews: go after the corporations buying illegal reviews.

Remember you don't need to get everyone doing this. Even a few cases that you get on the nightly news will be enough to stop a large majority of the fraud. You just need to get enough that everyone else decides not to do this.


It's very easy. Fine Amazon 100 billion dollars for fake reviews. Amazon and everyone else gets rid of the review feature the next day. I'm not sure this is a desirable outcome but it's definitely feasible (just like most blogs got rid of comments).


Unfortunately, this rule excludes most of the fake reviews that plague Amazon.

There are a lot of outfits in Pakistan that recruit reviewers in the US by offering a full refund for Chinese products in exchange for a five star review.

This rule should require disclosure of this behavior, and frankly of any review that does not originate from a bona fide purchase.


The rule covers this

> Buying Positive or Negative Reviews: The final rule prohibits businesses from providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative. It clarifies that the conditional nature of the offer of compensation or incentive may be expressly or implicitly conveyed.

> Fake or False Consumer Reviews, Consumer Testimonials, and Celebrity Testimonials: The final rule addresses reviews and testimonials that misrepresent that they are by someone who does not exist, such as AI-generated fake reviews, or who did not have actual experience with the business or its products or services, or that misrepresent the experience of the person giving it. It prohibits businesses from creating or selling such reviews or testimonials. It also prohibits them from buying such reviews, procuring them from company insiders, or disseminating such testimonials, when the business knew or should have known that the reviews or testimonials were fake or false.


I recently bought a $9 TV antenna that promised a $50 Amazon gift card for a five-star review.


Where to get this deal?


Seller was ETBRJTK (known for their quality and honesty!)


That is a dirt cheap acquisition cost.


The first thing you have to do is ban USA-based online marketplace companies from hosting foreign vendors. Then you can better regulate what is left.


Or you can hold the marketplace companies responsible for whoever markets on them.

Of course you can't hold foreign marketplaces responsible at all - but that is a different loophole we can close (if it becomes a problem, Amazon will ensure it is).


Why should a foreign marketplace get the same benefit of the doubt as a domestic market? Especially if those foreign markets belong to countries that are hostile to mine?


The US government cannot directly do anything about foreign markets.


We could easily restrict them from selling in the US


Not easily. Shipping to the US is not hard, and the post office / customs doesn't know what company is sending things. Customs can check, of course, but for small-value packages they typically do not; so long as the import forms look somewhat reasonable and the import taxes are paid, they generally are not looking that closely. It is even harder if someone knows they are an illegal company, as then they put a false return address on the package and there is no easy way to track them down.

Regular people are not someone customs wants to make an example of for ordering from a banned foreign company. Large companies will carefully check their supply chain to ensure they are not buying goods from banned companies, but tiny companies and regular people won't care as much.

I expect that the EU will have (if they don't already have) rules similar to the US and countries there will generally enforce them. However other countries (China!) will not care as much.


Classic Hacker News to complain about the market abusing the public's trust but when someone suggests a government make a change that benefits their own constituents over other countries it's down voted.


That... would not benefit the government's constituents, tho.


You could easily ban foreign products from being sold on US web platforms. We have done it to a limited degree many times.


The review suppression rule is hilarious. The intent seems to be to prevent people from using asymmetric access to the legal system to bully reviewers into removing reviews they don't like. The remedy? The thing the law was trying to prevent.


I don't understand this cryptic hinting. What's wrong with the rule, and what thing that the law was trying to prevent does the rule mandate as a remedy?



