Inside the next Xbox: Project Scorpio tech revealed (eurogamer.net)
158 points by anorborg on April 6, 2017 | 128 comments



What they really need to do is maybe just start completely over on the whole Xbox Dashboard. For something that could be so simple and easy, the whole experience of trying to use the Xbox for anything other than playing a game is insanely frustrating. Every time I sit down to try to use it I'm just tearing my hair out!

It's been, what, over a decade they've been working on that Dashboard and it has always been just a steaming pile of trash.

Everything from initial setup trying to make accounts for the kids, to setting up payment methods, trying to switch between users constantly as things are authorized in one place or another, trying to install apps, constantly turning it on to find it's forcing an hour of updates that must be installed to basically unbrick it. The media player app is insultingly bad. The App Store is a wasteland. The UI is a total train wreck.

I went to install Amazon's app to stream some Prime shows last week. You would think probably some other people have tried to do this before, what could go wrong? What a mistake. App constantly crashing, took over an hour, including power cycling the Xbox twice, before I got it to the point where I was actually watching a show. This for what is a < 5 minute process on a smartphone.

Every time I try to do anything with that infernal machine, I just end up tearing my hair out. I really don't understand how they have millions of users and so much basic shit Just Doesn't Work.

And don't get me started on the UI... Everything I want to do is buried the absolute maximum amount of clicks away from where I would expect to look for it.

I try not to complain about it usually because it just gets my blood boiling. How could MS put out such absolute garbage? Please tell me I'm not alone in feeling this way :-)

The hardware rat race is great and all, but for a platform where they have Apple-esque control I would have expected two orders of magnitude better results than what MS has managed to deliver in 2017 for a living room experience.


> I really don't understand how they have millions of users and so much basic shit Just Doesn't Work.

Because those millions of users use the box to play games, like I do. Oh, I've tried all the other stuff, and as you point out it's generally such a steaming pile of shit that I feel foolish for having even tried. (If the Kinect weren't in a box in the garage, I'd be staring in its direction right now.) Frankly, I'd be happy if they could just bring the Xbone back to the level of usability that the 360 had. To their credit, they're getting there little by little in many ways. To their discredit, they shouldn't have to do that in the first place. MSFT had a working platform, but I'm guessing some PMs needed to make their mark, so they "improved" things by breaking them.

So I just play games on it, and watch the occasional Blu-Ray. That keeps the "aggravation footprint" to a minimum. Anything else I want to do is on the Apple TV. And when the next-gen consoles roll out, I'm going to be taking a hard look at what Sony has to offer.


>I'd be happy if they could just bring the Xbone back to the level of usability that the 360 had

My housemates and I purchased an Xbone and realized before long we were using the 360 more because it was vastly preferable for netflix/youtube/etc. Have since switched to a chromecast for that stuff, but the Xbone is absolutely a massive step back in UX. Not to mention the ads on the dashboard. There is advertising on the dashboard of my $250 console, that requires a paid subscription for online play. Why is that acceptable?


Because for some reason you keep paying the subscription, buying the games and playing them?


Ha, fair enough. I don't actually have a Live subscription. The Xbox One is strictly a couch co-op device for me, which made the last Halo release even more appalling.


> but I'm guessing some PMs needed to make their mark, so they "improved" things by breaking them

I loathe this so much and you know this is _exactly_ what happened. This is a problem at a lot of big companies. At the end of the year you don't get reviewed on "improving existing customer experience" you get more money based on how many features you put into production.


See Excel, Google Maps, Chrome, etc. Really anything by Google starts out amazing, and then ends up as either total crap or relegated to a black hole. MSFT has gotten a lot better, relative to before, but is still terrible in absolute terms. I know why this is, managers gotta manage, mortgages gotta get paid, etc., but the real question is why do companies get into these traps, even though it happens so damn often? The only thing I have is the Gervais Principle, but even that is too simple an explanation.


MS's original reveal event focused a great deal on making the Xbone the hub of the entertainment center. It's why I bought it over the PS4. Only to suffer all the same frustrations as the parent to your comment.

If they had sold this as a gaming console that also does a few entertainment center things, then fine. But they didn't, at least not originally.

And the (almost) worthless Kinect acts as the IR blaster for control of other devices. And mine shuts down intermittently since one of the updates a year ago. That's the only thing I use it for since they changed all the voice commands to 'cortana' from 'xbox'.

And those 30 minute updates to unbrick the device. Inability to update all apps/games at once. Inability to have automatic updates without keeping the thing in "jack up your power bill" mode and "keep the fan in turbo mode".

And the useless review system. Who buys games without reading reviews? Not on the xbone. You get star ratings and that's it. So instead of being a hub, you still need your laptop (or phone) on hand.

I dare anyone to try setting up a child account using only the Xbox controller. I'm pretty sure MS has never done any usability testing on that. It's nigh unto impossible.


> Frankly, I'd be happy if they could just bring the Xbone back to the level of usability that the 360 had.

I only own a 360 and was reading the parent comment in that context, nodding my head vigorously. You mean the UI on the Xbone is even worse?

Microsoft is such a schizophrenic company.


I'll give you an example: party chat. Works just great on the 360, right? Send a friend an invite, they accept, you talk. It's been working for ten years, why change it? Well, change it they did. I forget the details, because it was at launch and they've since fixed it, but party chat just didn't work the way it used to, and I'd argue it worked in a broken manner.

But, yeah, the Xbone is worse. Primarily, IMO, worse because you are no longer the customer, AFAICT. I wouldn't be surprised if suddenly a "whack the monkey for a FREE month of XBL!" banner showed up on the dashboard. But there are some serious usability issues in there, too.


Honestly don't have any of the same experience as you except that some things are buried behind too many clicks. I use the amazon stream all the time. I don't even remember installing it. Can't remember ever having a problem installing apps.

I have it set to always on, so I basically never even notice the updates since they happen while I sleep or am at work. So yeah, I don't share your experience at all.


I tried that, but it keeps the fan running and draws 60+* watts of power constantly (mine is original, I don't know if the newer models have lowered this). Mine is in an open entertainment center and I can hear the fan running from the next room.

But either way, why is it so important that I install the update just to use the Amazon Prime app? You can't do anything online until the 30 minute install finishes. I can see blocking online multiplayer games, but video streaming?

* Number may be off a bit; I measured it shortly after it was released. I put a watt meter on it and left it on for a few days to see if it would ever go into a lower power mode. It didn't.
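For a sense of scale, here's a back-of-the-envelope calculation of what that always-on mode costs per year. Both figures are assumptions for illustration: the ~60 W draw quoted above, and a roughly US-average electricity rate of $0.12/kWh.

```python
# Rough annual cost of a console idling at a constant draw.
# WATTS comes from the watt-meter reading above; RATE_PER_KWH is
# an assumed US-average electricity price, not a measured value.
WATTS = 60
RATE_PER_KWH = 0.12  # USD per kilowatt-hour (assumption)

# kWh consumed over a full year of always-on operation
kwh_per_year = WATTS / 1000 * 24 * 365

# Annual cost at the assumed rate
cost_per_year = kwh_per_year * RATE_PER_KWH

print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.2f}/year")
# → 525.6 kWh/year, about $63.07/year
```

So always-on mode is on the order of $60 a year at that draw, which is real money for a feature whose main benefit is silent overnight updates.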


I agree, and I regularly miss the blade dashboard from long ago. Every update keeps making it more difficult and less intuitive.

When you don't have a mouse or a touch screen, putting things in non-linear places makes them extremely difficult to navigate to. I find myself regularly going down and over and up and around, almost circling things to select them.

Old gaming menus were horizontal or vertical but not both at the same time.


I've got an old 360 with the original blade interface; unfortunately, it's pretty unstable (high chance of red-ringing while playing anything). I pulled it out of storage a couple years ago to get the serial number when MS was offering a free memory card for older machines.

The interface felt so nice to go back to! Things were logically organized! Ads only showed up in the store tab (i.e. when you're looking for things to buy anyhow)! Going back to the current interface on my replacement 360 is what pushed me to start playing with blocking the ads using router rules.
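The router-rule approach above usually amounts to DNS sinkholing: telling the router's resolver to answer queries for ad-serving hostnames with a dead address. Here's a minimal sketch that generates dnsmasq-style entries (many consumer routers run dnsmasq); the domain names are hypothetical placeholders, not the actual hosts the Xbox dashboard pulls ads from.

```python
# Sketch of DNS-based ad blocking for a dnsmasq-running router.
# The domains listed are HYPOTHETICAL examples; you'd substitute the
# hosts observed in your own router's DNS query logs.
AD_DOMAINS = [
    "ads.example.com",
    "telemetry.example.net",
]

def dnsmasq_sinkhole(domains, sink="0.0.0.0"):
    """Return dnsmasq config lines resolving each domain to `sink`."""
    return [f"address=/{d}/{sink}" for d in domains]

for line in dnsmasq_sinkhole(AD_DOMAINS):
    print(line)
# → address=/ads.example.com/0.0.0.0
# → address=/telemetry.example.net/0.0.0.0
```

Appending the output to the router's dnsmasq config and restarting the resolver makes every device on the LAN, console included, fail to resolve those hosts, which is why this works where on-device blocking isn't possible.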


I once tried to use the XBox360 and it wanted me to verify the account before I could update, which took me into a hell that I wouldn't emerge from for 2+ hours. At the end, I was so upset at the BS I had to put up with (including >30 minutes on the phone with their support), I was ready to throw the machine out the window.


> Everything I want to do is buried the absolute maximum amount of clicks away from where I would expect to look for it.

Part of the latest update had a video for the new Home button functionality. It used to be that pressing the Xbox button on your controller took you home. In the video they literally said "to go home, press the Xbox button and then the 'A' button twice".

I can't imagine recording a video telling people to do that. They got it wrong IMO. They should have just had the Xbox button take you home and a long press take you to the menu that you see presently when you press that button.


The new functionality is closer to the original 360 functionality. I preferred it and welcome its reinstatement.


100%.

The UI is insanely difficult to use. I frequently find myself wondering how to do the most basic tasks.

If it weren't for Halo, I'd be gone a long time ago.


such as?


Read reviews for games you're looking at buying? Star ratings are all I've found, but they sold this as an entertainment hub and touted how it's all integrated.

What's the point of all the flashy store interface if it's lacking what has become a critically important part of the buying process for most people?


Hmm. There are mobile apps for iOS, Android and Windows that allow you to browse the store. The store is also online via the web, and the Xbox includes the Edge browser, so you can search the web for game reviews.

The latest dashboard updates feature a few reviews, but let's be honest: it's a hard problem to solve on a console, since providing feedback for apps is better done on a tablet/PC/laptop than with a controller.


Totally agree. Although it offered "simpler" functionality, the old-school menu and UI system on the Xbox 360 was LIGHTYEARS ahead of anything the Xbox One has put out. Everything was fast, easy, and as expected, from lobby systems to profile viewing to party chat/party lobbies. I feel like the One was a major step back, UX-wise, from the 360.


Working on the 2010-2012 dashboard was one of my favorite parts of my career. Back then the dashboard/system OS had a hard limit of 64 MB, which forced all kinds of clever optimizations to fit that memory profile.

My favorite day was when our local builds started spitting out an "Out of memory" error, and quite literally everyone on the shell+platform team was trying to debug what that meant - it turned out our builds were actually, finally using over 64 MB of memory.

Many nights were spent converting PNG assets to vector drawings.


Yea, many nights of mine were spent converting finger movements into headshots. So your work is appreciated.

It is just amazing to me that it was such a downgrade. Granted, the Xbox One's menu system has improved over the years, but it still feels sloppy, laggy at times, and just overall non-user-friendly. You have some menus where you have to press different buttons for actions (like inviting people to a game) and others where you select the option with 'A'.

I may be a bit jaded because it was so awful at launch, but I still feel like it never got close to as clean and usable as the 360.

I will stop ranting now.


100% agree... If they could steal a top UX designer from, say, Apple and start over from scratch, I feel like they could build something amazing. I love the XBOX and have been a supporter since day 1, but the new dashboard interface keeps getting worse and worse.


IMO the media center wars are being won by the actual TVs. We bought a new Samsung TV (SmartTV) over Thanksgiving. It provides the usual built-in apps (Netflix, Amazon, Hulu, etc.). The media center experience is much better on this TV compared to other devices: no input switching, native 4K, fast switching between apps, and a simpler remote experience - no switching activities on a universal remote or turning devices on and off.

At this point Apple TV is essentially obsolete for us. We do use XBox for gaming.

IMO the XBox is better off just getting rid of the whole media center thing and focusing tightly on the gaming experience.


> Media Center experience is much better on this TV compared to other devices: No input switching, native 4K, fast switching between apps, simpler remote experience - no switching activities on a universal remote and turning devices ON/OFF.

Unfortunately, they have a history of not being updated and, even worse, of executing all kinds of spying activities on their users. [1]

We bought a dumb TV and hooked up an Apple TV. At least Apple respects privacy a little more. We also have an XBox One, but we rarely use it. We only have time for quick games. The Apple TV has a nice selection of titles that you can play for a couple of minutes. In contrast to the XBox, it doesn't take ages to actually start up. (Yes, I know you can use standby, but it uses 15W.)

[1] https://www.wired.com/2017/02/smart-tv-spying-vizio-settleme...


The Xbox Amazon video streaming app has always been pretty crappy. Significantly worse picture quality than streaming on a computer (unwatchable in my opinion), and odd/exceptional choices for navigation. Much of the blame for this is on Amazon, but some is on Microsoft as well for allowing Amazon to release such a poor quality application. I guess that is to your point on Apple vs. Microsoft: Apple wouldn't allow such a poor quality app on their platform, Microsoft gives them enough rope to hang themselves and doesn't stand in their way.


OTOH, I have an Xbox One and a 3rd Gen Apple TV and I much prefer the UI for video streaming on the Xbox to the Apple TV. Scrolling through things takes forever on the Apple device, while the Netflix and Hulu apps on the Xbox are a breeze and I can scroll through content with ease. Though, that's probably a function of the controller (much more input options on the Xbox controller than the Apple TV "remote").


My #1 UX issue is that it makes me manually sign into my profile every time I turn it on, which is a clunky and overlong process, even though I only have one profile and I ALWAYS use it.



Do you have "Instant sign-in" turned on for that profile? I've never had an issue with that.


Agreed - I watch everything over Xbox since it streams audio to my headphones. Finding apps consistently took me weeks to figure out.

The other problem is that there is no unified control scheme for streaming apps. In theory, you are giving each app flexibility to create a unique experience. But in practice, you need to learn the random button mappings for FF, REW, etc. on every streaming platform.


hmm... I've never had any of these problems.

The first time I used it I spent about 20 seconds looking for the "my games and apps" button, and since I've found it I've had zero other issues with the UI.


I wholeheartedly disagree... It takes 5 minutes to learn the UI if even that long. You have pins, you have "blades" (they're just the top items that you navigate with bumpers) and a left side bar now.

My gut says way too many people look for reasons not to figure out things and blame the product for having shortcomings that are entirely self inflicted.

As for Amazon, their app is rather crappy almost universally. Try Netflix or Hulu.


If you own a PS4 or an X1 and the money to buy a PS4 Pro or Scorpio, it seems like the best move is to buy the upgrade of the platform you don't have. Then you've got access to both platforms' exclusives, and at least one new-spec machine for the games that aren't exclusive.

I think the most interesting thing is going to be seeing how compatibility works out long-term for Scorpio. What will the cross-compat story be for the next console, and the one after that? Will they re-platform and lose compatibility again (and bring it back via emulation) or keep re-spec'ing the current platform? When will we see games that will play on Scorpio but not X1?


> If you own a PS4 or an X1 and the money to buy a PS4 Pro or Scorpio, it seems like the best move is to buy the upgrade of the platform you don't have. Then you've got access to both platforms' exclusives, and at least one new-spec machine for the games that aren't exclusive.

As someone who owns both consoles, after seeing the lack of enthusiasm at the PS4 Pro launch event I decided to hold off until Scorpio specs became known - I much prefer the Xbox "experience" but outside of exclusives it never gets much use since my "high-fidelity" gaming goes on my PC, and "console exclusives" usually are better on the PS4 (looking at Final Fantasy XV in particular here).

I'm glad I waited, there are games I want to play that are console exclusives but not tied to one platform or the other (Kingdom Hearts 3) - it looks like Microsoft hit this one out of the park. Seeing as the HDD in my OG Xbox One is going to fail soon (man is it noisy) the Scorpio looks like a good upgrade path.

> I think the most interesting thing is going to be seeing how compatibility works out long-term for Scorpio. What will the cross-compat story be for the next console, and the one after that? Will they re-platform and lose compatibility again (and bring it back via emulation) or keep re-spec'ing the current platform? When will we see games that will play on Scorpio but not X1?

I believe the intent is that we are entering an era of smartphone-esque spec bumps to consoles, and getting rid of the full generational gap that has historically existed.

From here on out, expect new hardware every couple years that will be compatible with your existing library - after existing hardware is X generations old it will stop receiving software support and you'll have to upgrade. Likely, when the Scorpio+1/2 is out you'll see the original Xbox One and Xbox One S losing support for some newer titles (but they'll likely continue receiving support from indie developers and less intense games along with system software updates for a while after that).


> From here on out, expect new hardware every couple years that will be compatible with your existing library

So, basically a computer, but limited, and with a lower barrier to entry.

I understand the use case for those that might not have a traditional computer or have a very old one, but presumably everyone here would be better served by either shelling out an extra $100-$200 every couple years for a better video card in their laptop (if that's the only system they have), or dropping that on a discrete video card and sticking it in their current desktop?

I'm having trouble finding a case where I'm not better off buying a cheap Dell desktop and throwing a mid-range video card in there for approximately the same cost. It's a little bulkier, but I imagine Steam's Big Picture mode probably does a good job of the interface.


You're not exactly wrong, but you're not 100% right either.

Consoles have one thing that PCs don't: a streamlined experience tailor-made for your TV. People do more than just play games on a console; they stream Netflix, interact with their friends through sharing videos and clips, and listen to music on their home entertainment system that's already connected to the TV.

Valve has been trying REALLY hard to get to this point, but every time they add something new to the Steam platform it feels half baked in comparison to Microsoft's offering in particular.

Personally, I'm a PC gamer first - the lack of an integrated experience doesn't bother me when I just want to play games. My friends all use Discord for voice/text communication, I can use AMD's software to handle recording video and share it wherever I please, or use OBS to stream to twitch. But, I'm also a professional who uses computers all day and know how to put pieces together to fit my workflow - my friends are at least competent enough to do the same, but my wife or other groups of friends may not be.

Some people just want to buy a box, plug it in and have everything they want. That's the selling point, and unless Valve pulls some huge overhaul out of their butts, that is an advantage that consoles are going to retain.


Yeah, I understand for the general public, and I wasn't trying to imply there's no place for the new consoles, just that I figure most of the people reading HN could probably get away with putting that money towards their main computer platform more effectively. That does, of course, assume mostly single use, and that's a rather large caveat I didn't cover. I have three kids, and for a set-top box, ease of use is extremely important, since it's not just me using it.


I'm a typical HN user I suppose. Own/run a software company, we even code in Microsoft's tools! However, I do 99.9% of my work on a Mac laptop.

I've installed Steam on said laptop but it's just not the same as having a console.

I love my XB1, it just works. I buy a game, pop the disc in, download a metric shit tonne of updates for a couple of hours (a rant for another time) and I'm good to go.

I don't care about 60FPS or 4k resolution. I care about being able to use my minimal spare time to shoot teenagers on COD. I'll sacrifice an optimal gaming experience, for ease of access.

Just another thought.


> I've installed Steam on said laptop but it's just not the same as having a console.

So, assuming the games you want to play are available on Steam for the Mac (a big if, likely), how bad is an Xbox USB controller, HDMI out to your TV, and Steam Big Picture mode? Is it sub-par, or have you never bothered because you have the xbone (or because it doesn't have what you want to play)? I have neither, so I honestly don't have any data on this, and I don't use Big Picture because I do my gaming on a desktop.


Big picture mode is OK, but plugging my laptop into the TV and launching steam is a multi-step process, esp. compared to just pressing the playstation button on the playstation controller (everything's already hooked up, it just comes on).

I also have a gaming PC hooked to my TV and I run big picture mode on that. It's slick, but it's not as smooth as a console. The controller will wake up the computer, but I have to hit the button twice (once to initiate the wake-up and once to connect the controller, which fails to connect on the first press because the computer isn't awake yet) and then there's a 50/50 chance that the controller won't let me get past the windows 10 login screen.

The console is an appliance made for running games, and the whole experience is so much smoother.


I tried using the Steam Link, and it's not user-friendly at all. I have a dual monitor setup with a USB KVM switch, and I cannot manage to get the Steam Link going to the TV with sound on and switch to the non-Steam computer with my KVM. Steam Big Picture works fine until there's a single problem and then all bets are off. Games not written with it in mind will pop out to situations where you need a keyboard and mouse at the strangest times.


Yeah so that's a good point about game availability. I only installed Steam on the Mac to play Kerbal Space Program! Every now and then I'll have a flick through the titles available but nothing there grabs me. As a result, I've never bothered plugging it in to the TV to have a play in Big Pic mode.


I have a PC hooked up to my TV and a PS4 as well. And I'm sorry to say that unfortunately the PC experience is still subpar and way behind the consoles (even though both Sony and Microsoft are working hard to make this experience terrible as well). Biggest issues:

- Big Picture mode is pretty buggy - it tends to freeze after sleep, sometimes won't recognise the controller after waking up, won't properly gain focus after boot, etc. etc.

- Windows is annoying to get into a proper "TV/Big Picture" mode. It'll lock the screen (forcing you to type in a password), show popups and steal focus from Big Picture (again forcing you to look for the keyboard/mouse), install updates and demand a reboot, etc.

- A lot of games have launchers which don't support a controller so you need to hunt for keyboard/mouse again to start them.

- A lot of games aren't available via Steam, and you need to use Origin/Uplay/whatever. Those don't support controllers or a TV mode, making the experience a hassle.

- Getting 5.1 sound over optical to work properly is a pain in the neck. On PS4 you just enable DTS bitstream. On PC you have to install hacked drivers (if your MB manufacturer didn't pay for a license) and then hack XAudio DLLs to get 5.1 sound. And even then it might not work with some games.

- Some games tend to show smaller fonts and/or not support controllers properly in their PC versions.

- Some games won't run as well as on console if you don't have top of the line hardware - even though the PC is faster.

- Multiprofile support is utterly atrocious. On PS4 my GF just chooses another profile and can play the same games as I do, isolated with different achievements/savegames/settings/etc. On PC it's an utter pain - Steam doesn't behave well if your PC has multiple usernames (!), switching users on Windows isn't possible with a controller, and you need to install things twice or just not be able to play the same game in parallel.

- My PC has weird issues randomly waking up from sleep, forcing me to shut it down. When it's shut down you can't power it on via a wireless keyboard/controller, making it another hassle to deal with. The PS4 wakes up with the controller without issues.

All in all, while games do run faster on my GTX 970 vs. the PS4, the UI experience is painful at times, and I can see me and my GF going to the PS4 just because it boots significantly faster into a game and doesn't force us to hunt for a mouse/keyboard to deal with the "issue of the day". On the other hand, the PS4 in "rest mode" will install OS updates, game patches and other things while sleeping, so it's pretty much always ready to go (unless you just bought a game and need to sit through a forced installation).


I'd have expected most people to have a laptop at home. Maybe I'm unusual, or maybe the people I know are unusual, but I don't know many who use a desktop PC at home any more.

Desktop work PCs are still common, but you're not going to spend your own money on them.


I have a home workstation because I do a lot of work from home. Working from a laptop is a sub-par experience because of the form factor. Right now, I'm using a Dell XPS 15 laptop at home, but it's docked to a desktop keyboard and mouse, and a 40" 4K TV I use as a monitor. Prior to the laptop (which I almost never move, but like that I can take with me if needed), I had a desktop at the same desk. So, right now I'm sort of hybrid, but prior to that I had a desktop at home (and a separate, older laptop that I almost never used).

As for "most people", it's important to consider different types of situations, such as families in which there might be a shared computer, or a desktop in the room of one or more children because it's sometimes more economical.


I don't know many, either. Even mine has been sitting idle for a while, because it's in my (toddler) son's room, I don't have anywhere else to put it, and it's honestly begging for an upgrade at this point anyhow.

Which is too bad. It's got a nicer keyboard+mouse, sound system, screen, storage, network connection, and graphics than my laptop. There's a dedicated desk and chair. The only place the laptop wins out: Convenience.


I'm the only person in my family that owns a desktop myself, but my group of friends is mostly a bunch of desktop users since one of our primary use cases is PC gaming (although mine is also a workstation for software development and some light video editing).

Regardless of their quality, you can get a basic no-frills laptop for $400 and be able to carry it with you - or you can spend that much on an equally no-frills (though still arguably better specification-wise) desktop and have it stuck wherever you put it. More people tend to favor the portability, they can have it at the kitchen table or in front of the TV - some people don't even own a proper computer and rely entirely on smartphones and/or tablets.

Desktop computing outside of the workplace is increasingly a niche, one that isn't going away as long as PC gaming remains, but to argue it's not shrinking is delusional.


I have a desktop at home. I despise tiny laptop screens and tiny laptop keyboards. And touchpads. Desktops have advantages when it comes to performance and cooling, which is a big deal for a gaming system.


You'll never get the same performance for the same cost on PC. They showed Forza Horizon 3 running at 60 FPS/4K on the Scorpio, in a stress test with weather effects plus the maximum number of cars on screen, while only using 66% of the available GPU power. And that was on a version of Forza that was not even yet optimized for the Scorpio; all they did was spend two days updating the engine to run on Scorpio. I'm not sure how much the Scorpio will cost, but the Nvidia 1070 struggles to run Forza Horizon 3 at 4K at over 50 fps. Getting anything more than a 1070 will likely cost you close to the price of the Scorpio just for the GPU.

Spec for spec, a PC would seem like a good deal for the price, but with all the optimization that goes into tweaking consoles to make games perform better, it's rarely the case for a PC to outperform a console at the same price. For example, they announced a hardware implementation of DX12 in the Scorpio, which isn't available on PC yet. We don't know how much this will affect performance, but it's the kind of thing that plays in favor of consoles.


> You'll never get the same performance for the same cost on PC.

In a direct from-nothing-to-full-system comparison, sure. But if you've got a desktop already, your costs might just be the Nvidia card, and maybe some more RAM if that's lacking. Newegg has Nvidia 1070 cards that advertise DX12 for $370. It looks like a 1080 will cost about the same as I imagine the Xbox refresh will cost for about 15%-20% more power. By the time this is actually released, a good 6+ months from now, I imagine those prices will probably be much better.

That's not the only thing that matters of course, but there are benefits to updating your general purpose machine, in that you might make use of that extra power in other ways (e.g. extra RAM making the system more performant in general, not just for games).


> But if you've got a desktop already

Killer right here for an increasing percentage of the population. Fewer households even HAVE a desktop PC these days; laptops, tablets and smartphones are the computing devices of choice for many.

Even if you HAVE a desktop, there's no reason to assume the power supply can handle a dedicated graphics card if it wasn't shipped with one. Many OEMs like to skimp on power delivery at the price points where the majority of consumer desktop PCs are sold, since a $400 machine has little profit margin in the first place.


> Killer right here for an increasing percentage of the population.

Sure, but that's why I specified everyone here, as in the typical HN reader. There are plenty of things that don't make sense for the average person but make sense for particular groups. I wouldn't recommend a Raspberry Pi game/set top/server to the average person, but I might to someone here.


I would have thought most households (in the US, at least) have a PC for taxes and youtube videos.


Youtube videos? Phones. Tablets. Chromecast, Apple TV, game consoles if you want it on the TV.

Taxes? Possibilities: 1) someone else does it, 2) you do it but all on paper, 3) work device (maybe even at home if it's a laptop), 4) something something public library. I don't think tons of people would miss their personal PC bigly come tax time (some would, surely).


TurboTax has an iPad app. It also works fine on a laptop.


I'm glad we're seeing utilisation at the 66% level. That extra power is going to be mighty useful for VR. It's going to be really interesting to see what Microsoft reveal there.

I have an Xbox One and a Switch. Personally, I love this combination and Scorpio could be a great upgrade depending on the VR story. While I do miss out on some PS4 games I'd love to play, I don't have enough hours to play the games I already have.

The ideological gap between Switch and the rest of the industry is widening. No doubt the Switch is a cheaper platform to develop for (4K assets are a huge burden) but is that enough of a drawcard? Likewise, are consoles like Scorpio and PS4 Pro becoming too difficult and expensive to develop for, unless you are a AAA studio? The bar gets set so high that smaller studios will find it harder and harder to keep up. I can easily foresee a future where the Switch has an amazing selection of first party, Nintendo games and a killer catalogue of indies, whereas Xbox and PS4 own the third-party, AAA market.


Actually it feels like the trend you are starting to see is that AAA studios can't even keep up (aka Mass Effect Andromeda, and other recent high profile releases). The graphics burden just seems too high that everything else suffers.

But it's not all bad. I feel like, especially on the PC, smaller indie studios have been flourishing and some of the best games in recent times are not AAA nor graphical powerhouses.


> Actually it feels like the trend you are starting to see is that AAA studios can't even keep up (aka Mass Effect Andromeda, and other recent high profile releases). The graphics burden just seems too high that everything else suffers.

The problem with Andromeda was animations, not AAA graphical fidelity. I don't think many people are unhappy with the current level of flashy effects, we're mostly just looking at higher resolution assets (which likely exist already).


Animations are a big part of graphical fidelity. Games with lower-detail character models relied more on your mind to fill in the details as characters spoke or made facial expressions (I still remember an incredible scene in 2001's Anachronox where a character slowly smiles, conveyed with the motion of three vertices), but when you have a near-photorealistic rendering of a human face, the animation has to be up to standard. This is every bit a drain on art budgets as the modeling and textures.


Wouldn't then be desirable to avoid showing human faces up close and, instead, explore other ways of telling stories that better match both budget and technical resources?

Insisting on showing a character in great detail when the costs of doing so far outweigh the benefit is unwise.


If your game is a third person shooter which is a sequel to third person shooting franchise you can't really do that.


Note: Eurogamer has the official exclusive for this information.

The tech is incrementally better than the PS4 Pro, but the price is unknown and could be problematic if released at $499 compared to the PS4 Pro's $399. And whether developers will take full advantage of that power.


On paper it's incrementally better--significantly higher clocks, more RAM and much more memory bandwidth. But the bigger news is that all existing Xbox One games get 16x anisotropic filtering, framerate boosts, v-sync, and resolution bumps (if the game supports dynamic resolution). Not to mention system-wide downsampling for 1080p TV's.

Should be a pretty big bump in image quality for Xbox One games. They demoed Forza running at 4k/60 FPS with PC Ultra-equivalent settings at less than 90% GPU load.


Not sure why but some of your previous comments are showing up [dead] in the threads.


Replying to the dead comment below, his post history looks great to me, which is why I felt compelled to point out the dead flag.


OT, but I'll keep it short, and the user should know it is okay now: you are right. I vouched for the comment in this thread and that seems to have helped.

Looking at the history, it's probably the anti-trump comment that got flagged.


DirectX instructions that were handled by the CPU are now handled by the GPU!

https://www.youtube.com/watch?v=RE2hNrq1Zxs&feature=youtu.be...


Eh, that's not the biggest deal; it's basically always been like that on consoles, and that's the whole point of Mantle/Metal/Vulkan/DX12 - i.e. the fact that GPUs have MMUs these days, so user space can just write raw to the command buffer without being second-guessed by the kernel drivers.

The switch to DX12 is just that you can write the same sort of code on Windows and Xbone (the 360 had enough extensions that you sort of had to have separate rendering backends for 360 and Windows, despite them both potentially being DirectX 9).


Maybe I am completely off the mark here, and I'm sorry if this is off topic, but as someone who follows very closely what Apple is (or isn't?) doing in their Pro machines, I am very curious how these specs compare to what everyone is asking for from Apple. The RAM is obviously low, fine, but it sounds like the CPU and GPU are significantly more capable than anything that Apple is shipping right now, and Microsoft will sell this thing at a fraction of the price that Apple will sell any pro hardware. What am I missing? Where is the big gap in component cost? How is something like an Xbox so different, and so much cheaper, than a pro level desktop?


Microsoft is running a custom SoC that is tailor-made to their platform, with shared memory, power delivery and internal communication channels for the CPU and GPU. This right away reduces a lot of costs you see in a traditional professional desktop, where you have a GPU with separate power requirements, dedicated VRAM, and all the hardware needed to communicate over PCIe on both sides. Not to mention the Xbox One uses really weak (in comparison) Jaguar cores, instead of something much more expensive like Ryzen or Kaby Lake.

The CPU+GPU alone can add up to over $1000 retail on a workstation, and that's with a consumer CPU and GPU - once you get into "workstation" graphics like AMD FirePro or NVidia Quadro territory, the GPU alone can cost over $1000 retail for validated drivers that support CAD applications, etc.

Factor in all the extra components that go in to support upgrades (memory sockets, PCIe slots, external connectivity like Thunderbolt) and the fact that these components use much more power than the Xbox One SoC and as such require more cooling and you can see how costs quickly add up.

Now, with all that said - the Mac Pro is pure price gouging. You can get an equivalent workstation from HP for a fraction of the price, but then it doesn't run macOS.

Personally, I find it rather amusing. I remember when the Intel cheese grater Mac Pro was first introduced and Apple was showing off an equivalently specced Dell workstation was more expensive than the Mac Pro. Apple has really lost their way in the professional space (see everything they've done with the MacBook Pro since 2012 as well).


The Mac Pro was actually pretty well priced for what it was offering at launch. The price just has stayed constant and everyone else kept improving their hardware.


A big part of it is the type of graphics card. FirePro (the kind in workstations) are much more expensive than normal general purpose graphics cards, which are again more expensive than custom SoCs which can be developed barebones for a specific use. Reason being that a calculation error that results in a dead pixel is fine when you are playing a video game. It'll be there for 1/60s and then be gone, never seen again. The same error in a Disney film or a 3D scene rendered for a poster needs to be pixel perfect, so workstation class cards have a much lower threshold for error, and cost accordingly.

(Also, and this is just conjecture: its possible that Apple intentionally overprices their pros as a sort of "look this is premium" cost, while the goal of an xbox is mass market sale).


> Reason being that a calculation error that results in a dead pixel is fine when you are playing a video game. It'll be there for 1/60s and then be gone, never seen again. The same error in a Disney film or a 3D scene rendered for a poster needs to be pixel perfect, so workstation class cards have a much lower threshold for error, and cost accordingly.

I've heard this a lot, and I don't doubt that it's correct, but could you explain why? Like do consumer class GPUs have less accurate floating point, or do their embedded algos contain hacks to produce less accurate results faster?


In the past, I remember being told that the fixed-pipeline cards could push a ton of vertices, but that they couldn't handle the same volume of textures and rendering effects as a consumer card.

One of the first things on the FirePro and Quadro Wikipedia pages is that the actual graphics chip in the hardware is the same as consumer levels. Aside from hardware tweaks like ECC RAM and possibly different display connectors, I think that the biggest differences are the workload that the drivers are optimized for and reliability guarantees for the results that the hardware produces.


Unfortunately I'm not familiar enough to give you a conclusive answer. One big thing is that most consumer-class GPUs use normal memory, whereas workstation-class use ECC memory, which can correct for bitflips that might occur during normal operation.


HBM is ECC by default, so that distinction is slowly vanishing.


That's not true. Loads of stuff is offloaded to GPU, which would cause a complete crash if it didn't work. GPUs aren't just layering rasters on top of each other.


It is though, here's someone else saying essentially the same thing: https://superuser.com/questions/690388/why-do-workstation-gr....

Drivers that prioritize accuracy, and ECC ram fall well within the realm of "lower threshold for error", and price discrimination is covered by my parenthetical.

That said, the reason that custom SoCs are often less expensive than mass market cards is just because they have fewer things on them. If you don't need an FPU (which you obviously do here, but as an example), you can leave it off, which saves silicon and saves money. General purpose cards needs general purpose things, but a custom chip can often leave out certain components, although I don't know which ones those might be in this case.


It is true. Workstation graphics cards often have ECC memory, consumer cards don't. The biggest difference in most use though is better fp64 support on workstation cards, important for CAD applications as well as some use cases with 3D animated films or special effects.


Animation software works just fine on geforces (no-one uses AMD for professional work) - you don't need fp64 for anything. Rendering for film is 99.9% CPU-based, and even if that weren't true, all the GPU renderers on the market don't make use of fp64 anyway.

The only reason to use workstation class cards is because the software is certified to run on them (i.e. you won't get support from AD if you use a GeForce instead of a Quadro). The only real reason used to be memory, but with 11GB in a 1080 Ti even that's less true these days.


Actually the difference between a top consumer graphics card and a professional card is just the drivers. A few years ago the president of nVidia went as far as to say that he was running a software company that monetized its software with "dongles" (the cards). It used to be the case that cutting or adding a link would turn a $500 consumer card into a $2000 pro card: the link would be detected by the driver to enable pro features.

Also, aren't Disney movies raytraced?


> Also, aren't Disney movies raytraced?

Yes.


They're really just two very different things, on many, many levels.

For general computing purposes, an Intel i5-based system would be significantly more performant than this, but the Scorpio is highly, highly optimized for its specific task.


Traditionally consoles are sold at a loss at first, and then as components come down in price it evens out. Last gen, though, the Xbone and PS4 were essentially sold at cost. Most likely this will be sold at cost, and then they'll make money off of games. Apple needs to cover the cost of the OS and make a profit off the computer alone, and people are willing to pay a premium for Apple.


> Traditionally consoles are sold at a loss at first

That has not been the case this generation, and even most Nintendo consoles were not sold at a loss. The Wii was already profitable at launch, hardware and price-wise. Plus, looking at sites like iSupply is not a good estimation of the real cost of consoles, since manufacturers who order millions of parts do not pay the same price per component as consumers.


You're not missing anything. If this console launch is anything like the previous generations the hardware will be an amazing bargain when it comes out. Comparable to a high end gaming PC or workstation for half the cost.

The only difference is that with Apple they will be expected to make a decent profit margin on that hardware and update it with newer specs in a year or two. With consoles, they will continue selling the same hardware for 5+ years so it is okay if they start selling at a loss initially if it means more money to be made from game sales.


Don't know if it's still the case for new consoles, but the ps3 was highly subsidised when it came out. Something like that would account partially for the price difference.


AMD vs Intel components is one part of the equation.


I've never seen anything in an apple desktop that suggests they aren't ludicrously overpriced. You can piece together a monster of a desktop for 2k, so I don't get it either.


If you're going to downvote I'd love to hear your reasoning. What about a Mac Pro justifies its price?


Does anyone know anything about the "Hovis method" they reference for fine tuning the power profile for each chip? I'm having a tough time understanding how tweaking the voltage rails from board to board can improve power consumption/heat dissipation without impacting CPU performance.


Normally, yes. Due to process variations each chip has a slightly different maximum frequency for a given voltage. Usually chip makers test each chip, sort them by how fast they can go at a standardized voltage, and sell the faster ones for more money under a different model number - sometimes also fusing off features like multi threading.

It sounds like what Microsoft is doing is standardizing on frequency and giving different chips as much voltage as they need to hit their performance targets. So everybody buying one of these consoles gets the same performance but some people will have more power hungry consoles than others. Possibly those people will get better heat sinks?
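
The intuition behind per-chip voltage tuning can be sketched with the standard dynamic-power relation P ≈ C·V²·f: at a fixed clock, power scales with the square of the supply voltage, so shaving the voltage for each individual chip saves real power without touching performance. A rough illustration in Python (the clock and per-chip voltages here are made-up numbers for the sketch, not Scorpio's actual rails):

```python
# Dynamic CMOS power scales roughly as P = C * V^2 * f.
# At a fixed target frequency, per-chip voltage tuning changes
# power consumption quadratically with the supply voltage.

def dynamic_power(capacitance: float, voltage: float, freq_mhz: float) -> float:
    """Relative dynamic power (arbitrary units) for C * V^2 * f."""
    return capacitance * voltage ** 2 * freq_mhz

FREQ = 1172.0  # MHz: every console hits the same clock target

# Hypothetical minimum stable voltages, varying chip to chip
# due to process variation:
chips = {"lucky_chip": 0.90, "average_chip": 1.00, "leaky_chip": 1.10}

baseline = dynamic_power(1.0, chips["average_chip"], FREQ)
for name, volts in chips.items():
    power = dynamic_power(1.0, volts, FREQ)
    print(f"{name}: {power / baseline:.0%} of average power")
    # lucky_chip: 81%, average_chip: 100%, leaky_chip: 121%
```

Same frequency everywhere, so identical performance; the lucky chips just run ~20% cooler than the leaky ones, which lines up with the "some consoles will be more power hungry than others" point above.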


Or same heatsink, different fan speed. Some people get a quieter one, and others sound like a jet engine.


This should provide some amazing graphics on a 4K screen! The console programmers do some incredible feats with the resources they have available.


This 4k screenshot is really impressive, and as the benchmarks at the top show, there's lots of headroom: https://cdn.gamer-network.net/2017/screenshots/Forza-Tech-Sc...

Takes a minute to load, I think their cdn is a little overloaded currently.


Great shot. 17 years ago I predicted that within 10 years you would not be able to tell the real from the computer generated. This is the first screenshot that qualifies IMHO, because it comes damn close to looking real.


Unfortunately they didn't have a screenshot of Forza running at Ultra level settings, which is mentioned in the article as being achievable, because I think the line would be blurred even further.


I hope Ultra level settings add a fair bit b/c the aliasing on that screenshot is incredibly obvious everywhere but especially on the cables holding the starting lights and the barrier fences. It looks good but it isn't real by a long shot.


I see it when you mention it, but it did not stand out to me by any means. And it might be an artifact of the image being downsized automatically for you to view it (I know it was for me).


And it comes at a time when I am least interested in video games. The hardware has been here all along.


What I find interesting is the parallel in movie content, where 4K UHD content just became available late last year (while 4K movie content was available earlier, all the stuff I'd seen was poorly encoded or heavily compressed). I see a lot of people on AVSForum complaining that there's absolutely no visual difference between 4K and 1080p sitting on a couch at normal distances, based on scientific arguments about the human eye's resolving power, or their own subjective experiments.

Given that the market reception to 4K TVs and UHD players seems to have been tepid at best, I find it interesting that Microsoft thinks 4K will be such a huge draw for consumers. Is it just bragging rights (my 4K's bigger than your 1080p)? Will 4K rendering really make 1080p output look that much better? Will people playing these things sit 1-2 feet from the screen? Or maybe VR headsets will benefit the most?


Definitely interesting. Regarding the human eye, how well people see varies greatly from person to person. However, it's not even only about the eye itself, but also how well suited & trained [1] your brain is at interpreting the signals.

I personally have always had perfect vision, and have always been able to notice even fairly subtle differences in resolution. My brother also has perfect vision in the classic 20/20 sense, but has to inspect closely to tell apart even 720p & 1080p movies. Our abilities to notice differences in bitrate are even further apart. [2] It's not even only about picture quality, but also about refresh rates. I can tell the difference between 100 Hz & 144 Hz, and I'll even notice bad frame pacing on an otherwise stable framerate. At the same time I have a friend who didn't notice anything different about the 48 fps variant of The Hobbit compared to other 24 fps movies.

It probably has a lot to do with what we spend our time doing, in that I've spent most of my life hunting for better picture quality & higher frame rates. Compared to the average person I must've spent an unbelievable amount of time thinking about & observing picture quality & frame rate. Thus I have probably developed a skill for this.

--

[1] The training can be an implicit side-effect of other activities.

[2] Interestingly this is for video bitrate. The reverse is true for audio bitrate. My brother being a lifelong audiophile claims to hear the difference between even 320 kbps MP3 & FLAC.


Yeah, I have a nice 4K tv and I still think 4k is a bit of a red herring. I even moved my couch a foot closer. The only thing I noticed is that I can now distinguish between 720p and 1080p where I couldn't on my old 1080p tv.

I think 4k is more important on 25" to 30" computer monitors that are perched just a couple feet away. I'm not convinced it makes a difference on a wall mounted TV.

I think the more exciting bits of progress are the wide color gamut and HDR.


The Xbox Scorpio SoC is an RX 480 again - a Polaris chip on steroids - coupled with 12GB of GDDR5 at 6.8GHz for 326GB/s of bandwidth (with 1.5-2GB reserved for the OS), plus the Jaguar CPU clusters, a vapor chamber cooler, and full DX12 hardware improvements. From the spec sheet that should be roughly a slightly overclocked GTX 1060 level of performance. They're also banking on DX12 precisely because the CPU is still Jaguar, and AMD's GCN hardware helps it boost async compute. The previous gen was also RX 480-based but a cut-down chip, performing below a cut-down GM204 (<980M).

And it did run Forza in 4K at 60FPS with ~64% GPU usage, though that was a direct port to the Scorpio from the XB1 version. A mobile GTX 1070 and up should still outperform it, and when it comes to properly optimized titles like BF1 / TF2 / TW3 / MEC etc., that headroom will go toward rendering more detail instead. Also, running the old 4K Forza at console graphics options vs PC graphics options is not an apples-to-apples comparison. Note that TFLOPS figures on AMD and Nvidia aren't comparable either...

Consoles can never reach PC levels of detail, given their market and target audience. I'm a PC gamer and I can't stand the degraded visuals on consoles; Nvidia's tessellation is also superior to AMD's. The sad part is that, because of market share, PC games end up a downgraded, unoptimized mess thanks to these consoles. The Scorpio also doesn't add any significant advantage over the PS4 Pro, given its less-than-stellar exclusive titles and the checkerboard rendering used. Finally, native 4K at 60Hz (or even 30) isn't settled: I highly doubt an RX 480-class chip can run recent games at 4K/60FPS - the power just isn't there - and upcoming engines developed on Pascal hardware will outperform this weak chip soon.


It's interesting that, for the second time this week, a big company has given journalists an exclusive and sort of intimate look at their future roadmap instead of saving it for big keynote surprises.

Between leaks making nothing ever a surprise anymore and the general jadedness at products never living up to their keynote claims, I wonder if this signals a kind of paradigm shift in "how to hype a product". Certainly all the Apple commentators seemed very pleased that Apple was breaking with their traditional radio silence, and this made me much more interested in Scorpio than I was recently, since I can see from a skeptical source that it's not just hot air.


Hrm, I can't help but think that despite this being only slightly more than Sony's PS4 Pro spec-wise, that difference is going to matter a lot because it seems to be the difference between being able to output (upscaled) 4k and being able to render 4k. We'll see if that turns out to be true when the Scorpio Xbox One actually launches (this kind of pre-release article is very prone to botching facts like this). If it does hold then this machine actually has some hope of switching the preferred platform for console games back over to Xbox One.


Does it really make much of a difference?

Even on a large-screen TV, you would need to sit about 3-5 feet from a 4K set to make out every pixel.

3k rendered, then upscaled to 4K should be pretty much indistinguishable to 99% of viewers.


I hope this is more than just putting out 4k games. Most people don't have 4k TVs, and, very importantly, for those that do, most don't have a big enough one for how far away they sit.

I'm thinking of upgrading to a 65-inch OLED this year, and from my couch of about 10 feet away (maybe a hair closer), I won't be able to tell 1080p from 4k. And that's a pretty big TV!

I would much rather see Scorpio games at 1080/60 fps with more effects, more polygons and more complex lighting.


This must be highly subjective, because for me a 65" 4K TV 10' away is like night and day compared to 1080p.


There's no difference side-by-side between 1080p and 4K resolution if you're 5ft or further away from the TV[1], all else being equal.

However, all else is rarely equal, and 4K TVs now have lots of other features and technology that make them just better (prettier) buys in general.

[1] http://s3.carltonbale.com/resolution_chart.html
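
The geometry behind charts like that is simple: 20/20 vision resolves roughly one arcminute, so a pixel stops being individually visible once it subtends less than that. A quick sketch of the calculation (assuming a 16:9 panel and the one-arcminute rule of thumb, which that chart also appears to use):

```python
import math

# 20/20 acuity rule of thumb: one arcminute per resolvable detail.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance beyond which individual pixels can no longer be resolved."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
    pixel_pitch_in = width_in / horizontal_px        # size of one pixel
    return pixel_pitch_in / math.tan(ARCMIN) / 12    # inches -> feet

for label, px in [("1080p", 1920), ("4K", 3840)]:
    print(f'65" {label}: pixels blend past ~{max_useful_distance_ft(65, px):.1f} ft')
```

For a 65" set this works out to roughly 4.2 ft for 4K and 8.5 ft for 1080p, so at a 10-foot couch even the 1080p pixels have already blended - consistent with the "no visible difference from the couch" reports in this thread.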


> A person with 20/20 vision

I'm honestly wondering if I'm an outlier and may have better than 20/20 vision. I would think it's pretty rare, but I have had people comment on my vision before. I work on a 17" 4k laptop screen with tiny text no prob.


Charts like this always get brought up, but I'm not sure there is much rigor in creating them.


At least with the 4k OLEDs, you are also getting HDR, way better contrast ratio, better colors, etc. So, you'll notice that, even if you don't notice the resolution difference.

I am excited to see Xbox support HDR. I just hate to see all of this extra hardware dedicated to resolution, when there are so many things that it could be put to.


The best 'gaming investment' I ever made was to take 2000 bucks and instead of building a new dynamite pc, I bought one of the best 1080p plasmas you could buy 3 years ago, and a ps4. I haven't looked back.

Consoles have come a long way, and the visual fidelity of a truly great quality display is a much better investment than a machine that can power a mediocre display at higher resolutions.

This is coming from someone who was gaming on ultra on 1440p for 5 years before switching back to consoles. It feels good to be back on the couch.


> I hope this is more than just putting out 4k games. Most people don't have 4k TVs, and, very importantly, for those that do, most don't have a big enough one for how far away they sit.

And don't have the bandwidth to download many gigabytes of forced game updates.


Not mentioned in the article, but apparently it will also support AMD FreeSync.

http://i.imgur.com/GUCJSAH.png


This is MS admitting defeat for this gen and going for an early next gen, with backwards compatibility providing the mid-gen excuse. It's a good move, though, and will put Xbox back on the map.


So Sony admitted defeat and made the PS4 pro?

I think more than anything, it's the fact 4k happened much quicker than expected..


Maybe. My thought was that the release PS4/Xbone were still too far behind PCs


People who buy consoles don't generally care how far behind PCs they are.


It sounds like MS has been able to get a lot out of their hardware. It will be interesting to see how much it costs.


I'm expecting it to be in line with predictions, right around $500. I won't be surprised if it ends up being more, though.


I suspect this is using the new Ryzen cores from AMD? If so then that by itself is a big performance boost.


They specifically state it is not.

> "To be clear, then: Project Scorpio doesn't feature Ryzen cores, but the Xbox team are not so concerned about this. "On the CPU side of things, we could still meet our design goals with the custom changes we made," Kevin Gammill points out. "At the end of the day we are still a consumer product. We want to hit the price-points where consumers want to purchase this. It's about balancing the two.""


I'm excited by the specs, especially given that Red Dead Redemption 2, Battlefront 2, Call of Duty WW2, and Destiny 2 are all hitting this holiday. Should be the best way to play those games.

Although I wish they had more unique games coming out alongside that stuff. Playing Zelda on Switch is a good reminder that eye candy doesn't matter all that much after the novelty wears off.



