> The SoC also contains a cluster of four A53 cores for power efficient processing, but Nintendo has chosen not to use them.
The rumor I've heard is that there's a bug in the system crossbar: whichever core cluster (CCX) you enable after reset is the only choice you get until the system is fully reset. That is, if you enable the A57 CCX, later enabling the A53 CCX triggers the bug, and vice versa, even if the first CCX is disabled before the second is enabled.
This is additionally odd because, according to Wikipedia, citing the technical reference manual (wherever you can find that), the A53 cores were so borked that later versions of the manual removed all references to their existence. The Tegra X1+ shipping in all Switches since 2018-2019 might, possibly, not even have the cores.
Does anyone know what this "bug" manifests as? Your first sentence implies it's a lockout created at reset, but your second implies it manifests post-reset through user/programmer action.
I think they're saying you get to run one CCX at a time, either the A57 cores or the A53 cores, and you can only change that once after a reset. Presumably the boot-time CCX is selected by strapping pins. Depending on exactly what "reset" means, this might be something like the Intel 286, which couldn't leave protected mode without a reset; systems were built for OSes that entered protected mode, where the OS could set up a resume vector, trigger a reset, and jump back into its real-mode kernel. Or it might be a very intrusive reset that wipes memory contents, which isn't something worth engineering around. Either way, it's easier to just use the A57 cores to begin with, especially if switching between the two is problematic.
Assuming it is some "at reset" selection, Nvidia advertising it as an eight-core chip would be deceptive, so I'm reluctant to believe such a theory. I wouldn't put it past marketing to do such a thing, but later revisions of the manual don't mention the A53 cores, so I'm inclined to believe it's a hardware bug.
Unfortunately, without someone from inside Nvidia telling us, all we have are rumors and no evidence.
ARM big.LITTLE systems started out as use-one-cluster-or-the-other, never both simultaneously. Advertising those as 8-core when it's really big 4 or LITTLE 4, or later incarnations where you could use the big or LITTLE core of each of the four pairs, is sketchy, but it was common.
If the plan was to allow big 4 or LITTLE 4, and a hardware bug then became apparent where you could switch to the LITTLE cores but not back to the big cores, well, you notify customers and stop advertising the LITTLE cores. archive.org has them mentioned Jun 1, 2016 [1], and then removed Jun 14, 2016 [2]; the A53 cores aren't mentioned but are shown in the die map on the current page [3].
Gunpei Yokoi's ethos of "lateral thinking with withered technology" [1] has been a guiding principle within Nintendo since the Game & Watch days: use "seasoned" or otherwise imperfect technology in creative ways. Using a broken SoC, which they probably got a great deal on and which still fits their needs, fits perfectly in that mindset.
Oh, please. Nintendo is infamous for being cheapasses with their hardware design decisions, working to boost profit margins over all else.
The N64 is a great example. Top-notch graphics which they spent a small mint on... that was then hobbled by a storage format they could exert more control over, resulting in ported games having to be chopped up.
The argument they made at the time was that the cartridge format was about more than just making piracy more difficult. It enabled high-speed streaming of data to enable games like Ocarina of Time to be essentially seamless when going from location to location. Frankly, they have been averse to loading screens for a long time. Even the GameCube had a format that served multiple purposes, having a security advantage (at the time of design) and again enabling higher-speed data transfers, because they hated those loading screens so much. These days I think the mentality has shifted a bit now that a newer generation is really calling many of the shots there.
>Top-notch graphics which they spent a small mint on
From what I understood, the design was a completed SGI design that was originally going to go to Sega, but they passed on it and it went to Nintendo. This was the advent of 3D graphics in the consumer video game market, so of course they were going to make mistakes. It's questionable how much Nintendo was involved in the low-level design, but they did end up providing a very decent graphics solution at a reasonable price point.
Hell, the PSX had that terrible fixed-point integer math that caused polygons to "snap into place", producing the wobble effect, so it's not like they were the only ones making decisions that kept the final product from being as good as it could have been.
>resulting in ported games having to be chopped up
Maybe you're thinking of it the wrong way. Software was more custom-tailored back in the day to better take advantage of the custom abilities of the hardware, because there just wasn't any breathing room for more generic ports.
Just imagine all of the N64 software that didn't even make it to the PSX at all because it was completely out of the question. Why should Nintendo be blamed for choosing what they felt were specific features to make their games the way they wanted? If other software does not fit the paradigm that the N64 provides, then that is the fault of the software.
>It enabled high-speed streaming of data to enable games like Ocarina of Time to be essentially seamless when going from location to location
I recall reading this very same statement in a game mag during that time. It made a lot of sense considering how often the PSX would need to load in the map. Resident Evil 2 comes to mind, then MGS. MGS got away with the loading because it hid it behind the various FMVs.
>Hell, the PSX had that terrible fixed-point integer math that caused polygons to "snap into place", producing the wobble effect, so it's not like they were the only ones making decisions that kept the final product from being as good as it could have been.
Yeah, then there was this. To be fair to Sony, I think they went cheap on the initial PSX as it was uncharted territory and I'm certain the finance gurus had pumped the brakes on being too lofty out the gate. Now the "wobbly" graphics are viewed as an aesthetic lol.
The memory is actually very fast; it just also has high latency, which stood out particularly coming from the SNES, where you essentially had single-cycle access to anywhere in memory.
Rumor was NVIDIA was selling them at a discounted price, since the only other thing using these chips (that I am aware of) was the NVIDIA Shield, so that factored into them going with it.
They were probably in on it when it was pre-production. This is something that you often want to do as a large customer so that you are as close to the state-of-the-art as possible, but it comes with downsides! I am doing a similar thing on a project at work at the moment.
Yeah, but how long was the Nintendo Switch in development? And bear in mind that we're comparing it to a much simpler bit of Nvidia electronics.
It is entirely possible that it was just price-based and they didn't care about the chip bug, but given the timing I still think they would have selected the chip before it was complete and in consumer products.
Well, I think it's important to consider the competition in 2017 was not great.
The comparable competitive Android chip, also launched in late 2015, was the Qualcomm Snapdragon 820. That chip was widely known as one of the worst Qualcomm chips ever made: mediocre power efficiency, lots of heat generation, thermal throttling, and a buggy first attempt at 64-bit instructions. All that for a GPU that is, on paper, significantly weaker [1] (though maybe the Switch's cooling could've helped close that gap a bit). But even then, you're dealing with Qualcomm, and everyone knows they are just the worst.
First, because Qualcomm loves royalties based on the device's MSRP rather than a flat charge per chip. Nintendo probably wouldn't like that. Secondly, while NVIDIA GPU drivers are a proprietary blob, that's of little concern to Nintendo, and that blob can be easily adapted to run on any OS under the sun, including their own. With Qualcomm, enjoy a hackneyed Linux fork; that's the best you'll get. From our perspective they're both pretty bad, but from Nintendo's perspective, trying to add support to their custom microkernel Switch OS, one's clearly garbage.
Outside of Qualcomm... what else do you have for 2017? Exynos and MediaTek? I think it goes without saying... there are no upsides to passing on the Tegra X1 for a MediaTek from that era.
[1] Edit: I previously said 50% and 100% weaker, but that's very grammatically ambiguous; and FLOPs are a very bad metric of performance, because there are 3 different kinds of FLOP metrics floating around that aren't comparable (due to different levels of precision). Combined with the Tegra being designed for cooling and the Qualcomm designed for no cooling, it's hard to tell specifically how large the gap is, even though a gap is almost certainly there. I think my point still stands.
By 2014, the Tegra X1 was already picked as the Switch SoC.
From digging at history threads: The alternative SoC option they had was a quad-A53 SoC with Decaf (a Wii U GPU cut in half with Wii backwards compat gone) co-designed with STMicro.
It was going to be for "Deep Learning," "Computer Vision Applications," "NVIDIA DRIVE car computers," and robots. As we know, outside of some Tesla models, that didn't really happen.
Supposedly, according to people who have read the leaked Nintendo documents (Modern Vintage Gamer has implied it in replies to comments on his videos), NVIDIA had found the recovery-mode bug before the Switch's launch; but Nintendo couldn't just move the announced launch date to wait for a chip revision, especially after the Wii U's financial performance. So off it went, and they just had to cross their fingers and hope nobody found it. Nintendo probably got a good discount for that mistake too.
> As we know, outside of some Tesla models, that didn't really happen.
As YOU know.
The Tegra X1 (which the Switch used) was never used in any production automotive application, correct. But you mentioned Tesla, so let's talk about other Tegra generations.
Other Tegra generations were used in Teslas in varying quantities (Tegra 2, 3, and K1). Mercedes has been shipping Tegra in their "MBUX" cars for a few years now as well. A couple of Chinese companies are shipping Tegra via NVD. Volvo, Land Rover, and Jaguar are also going to be shipping it shortly as well.
I don't know why you accuse me as though I said something wrong, and then immediately admit that, yes, the Tegra X1 was never used in any production automotive application. That's what I was talking about, not Tegra as a whole. The only mistake I made was mentioning Tesla; the Tegra X1 wasn't actually used in any Tesla models.
A whole lot of the Switch's design was about driving down costs as much as they could. That's why it has no mic or camera, and didn't get Bluetooth headphone support for years after launch. Plus, chipmakers really want those console contracts; even if they take a loss on the first few hundred thousand sold, they're assured sales as long as the console sticks around. AMD made out like bandits with the PS4/Xbone.
Hmm, don't see why leaving out Bluetooth headphone support would save on costs. They enabled it with a software patch so the hardware was always there.
I've assumed the chipset was cheaper and they had to do less development/testing trying to get 8 controllers and a headset working at the same time. (it may support fewer controllers when a headset's connected)
I think it's fine unless you are playing something like Taiko no Tatsujin. (And you don't really rely on the hit sound when playing the hardest levels anyway; it's already too late by the time you hear it.) Most games don't really need such an immediate response.
The latency of non-low-latency (LL) codecs is objectively poor, though. The Switch supports the SBC codec:
> SBC operates at a latency of between 170 and 270 milliseconds (yanked from the web so it may be a bit off)
Stack that on top of the typical input latency and you're talking somewhere in the neighborhood of 1/4, 1/3, or even pushing 1/2 a second from an input to hearing the sound that plays in response to it.
An individual might get used to it, or it might not bother them, but it's still there and measurably poor. Even for straight video, the Amazon Fire Stick and other devices will delay the video to compensate.
LHDC and aptX-LL on the other hand can hit 30ms of latency.
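To put rough numbers on that, here's a back-of-the-envelope sketch in C++ (the 80ms input-latency figure is my own assumption, not a measurement):

    #include <cstdio>

    int main() {
        // SBC codec latency range quoted above; the input latency
        // (controller + game loop + display) is a rough assumption.
        const int input_ms = 80;
        const int sbc_lo_ms = 170, sbc_hi_ms = 270;
        const int ll_ms = 30;  // aptX-LL / LHDC class codecs

        std::printf("SBC total: %d-%d ms\n",
                    input_ms + sbc_lo_ms, input_ms + sbc_hi_ms);  // ~250-350 ms
        std::printf("LL total:  %d ms\n", input_ms + ll_ms);      // ~110 ms
    }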
It's interesting that, on the long-running and well-regarded (but obviously not authoritative) site "HG101's Top 47k Games of All Time" [0], the top 50 ranked titles include virtually no games (1? 2? arguably 3?) of a technically more advanced pedigree than the Switch's capabilities, in terms of applied processing grunt, 'graphical fidelity', etc.
Expand that analysis further, to the top 100, and it remains true.
'Regard accumulated over time' would clearly be a factor in that bias, but not a definitive one. Relatively modest processors like the A57, and (vastly) weaker, are still the home of the majority of human video gaming enjoyment.
Your point is well-made, but I'm going to nitpick anyway.
Video games have been around, at this point, for 40-50 years. On that basis alone you'd certainly expect the majority of human video gaming enjoyment to predate the A57.
The Switch has been a commanding platform commercially for its lifespan; given the ease of pushing out multi-platform games with Unity or Unreal Engine, devs are more likely going to target the lower-spec hardware and get a huge install base for free than to limit themselves to more advanced hardware.
Nier: Automata is no. 5 on that list, and while it runs on the Switch, by all accounts it's somewhat kneecapped by the hardware. The PS4 original is what's listed, and one wonders if it would've received quite as much adoration if it had launched on the Switch and looked or played the way it does on that platform, or even if the devs would have had the resources or freedom to develop it as-it-is in the first place.
Moving to the top 100, Bloodborne is at no. 63 and, so far as we know, requires computing power the Switch doesn't provide. Outer Wilds is at no. 96, and you could raise similar questions about it as about Nier; it's made it to the Switch (as of last week!), but it took the devs an extra four years to get it there.
It's, frankly, a somewhat quirky list. Among other things, it's heavier than most would be on Japanese visual novels, and it inexplicably has "Lode Runner Online: The Mad Monks' Revenge" at no. 35.
Also, don't underestimate the "regard accumulated over time"/historical import factor, which is a pretty universal element of how critics tend to approach these things. For example, the most recent (2022) Sight and Sound "100 Greatest Films of All Time" has no movies made after 2001 in the top 10; a full half of it predates what most people would consider "modern cinema" (roughly speaking the French and American New Waves in the 60s).
Of course, ultimately your point stands - technology hardly makes the game, and there's a good case that computing hardware has reached a "good enough" point of diminishing returns in the ways that it does. My PS4 is ten years old now and I still haven't seen anything on the next gen systems (themselves around four years old now!) that make me feel like I need to upgrade. I'm also of the opinion that various factors have made games-at-large increasingly less interesting over the past few years, but that's probably just me getting older and sadder.
> "Lode Runner Online: The Mad Monks' Revenge" at no. 35.
Lode Runner is a pretty popular game. I'm certainly not going to try to figure out their methodology, but I'm guessing Lode Runner volume sales peaked with this particular Win95 compatible version that was likely bundled with many computers that sold with Win95. The naming is a bit outlandish sounding now, of course, but it also plays in offline mode.
I wonder how many of the "best movies of all time" were shot on film, weren't edited with computers, and had an inflation-adjusted budget of under $50 million.
As my 6 year old makes his way through Fortnite Season 5, I'm impressed that his Switch still handles his no-scope sniper odyssey on these 8 year old Tegra X1s.
I've been playing Skyrim and Portal 2 on the Switch Lite and it keeps amazing me that games that my PS3 struggled with are now perfectly playable on a handheld device that has pretty good battery life and doesn't burn my hand. A device like this would have blown my mind 10 or 15 years ago (even more than the original PSP did the first time I saw one).
The PSP was amazing. The PS Vita - on a whole 'nother level, both performance-wise and in controller improvements. It wasn't until the Switch came around in 2017 (purchased in 2018) that we got HD (720p) gaming on a handheld machine. The fact that so many AAA games have been ported down to it is a testament to its capabilities. Not a fan of NVIDIA in general, but they did an amazing job with this unit.
If one wants real PC gameplay without porting down, the Steam Deck starts at $400, add a MicroSD card for more storage.
Frankly, we still have nothing providing HD gaming on a handheld. The switch is way too big to be considered a handheld. I can't slip one into my pocket and go, like I could with a DS or 3DS. It's frustrating, because I love my 3DS for gaming on the go but there is still nothing which can replace it. Nintendo just gave up on the handheld market.
HD at 720p is more than enough for a 6" screen. The Switch is literally the only capable handheld that could do that, and it still does. It doesn't offer excellence, but it offers good enough. As already written, the Steam Deck replaced it half a decade later. Not sure what your point is. Everything grows generationally in capability and capacity, including handhelds. I would not compare the Game Boy to the Game Boy Advance under such limitations.
I was just thinking that I’d love it if Nintendo made a Micro console again. I don’t feel like the Switch Lite was actually small enough to justify as a companion to the full-sized Switch. More just an alternative for those who don’t care about docked play.
But a pocketable Micro console would be awesome for those times I don’t want to have a full backpack with me, which the regular Switch requires.
The GBA Micro was about as small as a usable console could get. The older I get, the less accepting of these limitations I've become. The Vita and PSP were a good median in terms of practicality.
The lifetime sales of the Vita and 3DS combined don't even beat out the sales of the Nintendo DS. The dedicated handheld market was killed by the same killer of portable media players and point-and-shoot cameras: the smartphone. Like the other markets, not by the smartphone being better, but by being good enough and always in your pocket.
What really blows me away is stuff like the Z1 Extreme, and a laptop I have that is two generations behind that but has a 130W power envelope pushing 60-100fps in AAA titles at 1440p. It runs super hot, but in a few years we may have handhelds doing something similar, which I'm super excited for.
Nothing with a 220-300+W power envelope can be squeezed into a handheld for the next several years. Still, downgrading visual FX is doable. The Steam Deck 2 and Switch 2 are on the horizon. AMD vs. Intel, as far as I'm concerned.
I'm sure they are, but it's much less noticeable than the handheld ports of things used to be. Having played many hours of Star Wars Battlefront II on PS2 and original Xbox, the PSP versions of Battlefront (II, Renegade Squadron, Elite Squadron) had some obvious compromises but were at least very similar to play. Then I tried the DS version of Renegade Squadron and the gameplay was almost unrecognizable from the others, which still disappoints me to this day.
Granted, I'm playing Skyrim and Portal 2 on Switch Lite so I can't see them on the TV and compare them at the same size as the PS3 versions of those games, but Skyrim at least seems to suffer much less lag and visual glitching than the PS3 version. I don't have any newer console to compare with so I'm happy!
My laptop runs Kaby Lake and an Intel HD 620 and can't match the Switch's 30 fps performance. It's incredible what it can do. The 3070 that draws 220+W in my desktop is hard to replace or miniaturize.
There is a lot of space between a 3070 and a Switch. e.g. a few (I'd guess ~5) year old iPhones/iPads are considerably more powerful than the Switch. Unfortunately, it's just not a great platform for gaming.
Also keep in mind that iPhones and iPads were a lot more expensive than the Switch.
(Of course, Nintendo still sells (nearly) the same Switch for nearly the same price, but a five-year-old mobile device would go for a lot less nowadays.)
I have some great memories working on the PSP. While you could get some FP precision issues if you weren't careful, I really liked having BVH testing as part of the command buffer.
The Vita sucked with no homebrew, and the Switch is another PSP. On the PSP you could run games off the Memory Stick Pro Duo or a microSD adapter, but the Vita was locked down, slow, and full of lame remakes and PSP titles. The PSP emulated up to the PS1.
By the time the Switch was around, people had been emulating on Android for a long time. Those devices can do better graphics, but there hasn't been a game that made me feel like I needed 4K on a handheld. The Switch really is another golden age for gaming, having been hacked so quickly and having such good homebrew.
At least regarding the homebrew, I disagree completely. While it took a while for the Vita to be thoroughly hacked, it has been thoroughly hacked. And the benefits are numerous: using normal SD cards, expanding the integrated PSP hardware into a virtual PSP, reformatting the internal storage, extending it by replacing the 3G modem.
Meanwhile, the Switch had the big bootROM USB stack exploit, but nothing apart from the original SoC has a publicly known easy exploit (there are modchips, but nothing like the 3DS/PS3/Wii U/Vita/PSP/...). There also wasn't as much "cool" stuff to do as on older consoles with homebrew, due to the hardware simply being an Android tablet with controllers (which doesn't make a difference as a console, but makes it more boring homebrew-wise). So there is the usual stuff (savegames, different controllers, piracy, themes, overclocking), but nothing unique to the Switch.
By the time it was thoroughly hacked, it was too late. Much better hardware was around. People could do most of what the vita was eventually able to do with a phone.
Besides piracy, what do you expect from the Switch? The PSP had an ebook reader, a movie player, and could play MP3s, as well as other cool old games, very early in its release. It could play media and play games up to the PS1. Modding a PSP or Vita today is like maxing out a Citroën 2CV instead of buying an e-scooter. When the PSP came out it was amazing. Now it's yesterday's news. I'm impressed by the hackers that did it, but the Vita just doesn't impress with its capabilities, even with homebrew, today.
The PSP had amazing battery life and felt like a better Gameboy advance at the time.
Yeah, hacking portable consoles was interesting and useful in the PSP/DS era, even for script-kiddie users, because there weren't many affordable portable devices. Now we have smartphones, so it's for those who enjoy hacking and running emulators and/or pirated games.
What really killed the scene for video games was better devices. I got a PS4 that was hacked, and there's not much homebrew on it. I would update it to play online, since I don't do cool stuff with it, but I don't even play it enough to justify that. Now you can get a cheaper, better HTPC and use that instead of something like XBMC on an Xbox.
I wish the person I sold my fully loaded 1/2TB-microSD Vita to a grand adventure with a pocketable gaming/homebrew/emulation/piracy machine. Nothing in its market matches its capabilities. The Switch Lite tried, but I ain't touching that crap with no HDMI output and no mods available.
The Vita had a ton of amazing JRPGs, niche weeb stuff, visual novels and indie games. Bit of a weird lineup but for many years it was my most used system. These days the Vita has been cracked wide open and there's loads of homebrew.
Right now it's only the early Switch units that are hackable.
Did you have a PSP? The Vita was such a downgrade from it, and the Vita, like the PSP, mostly had remakes, except the Vita had PSP remakes too. It had no killer game or multimedia capabilities by the time it released. I can't name an exclusive on there at all; I don't even know if it had one.
Yes lol. I had a PSP shortly after launch. I still have one. And yes, the Vita had games and exclusives. Most of its launch lineup was exclusives: Uncharted: Golden Abyss, Wipeout 2048, Hot Shots Golf: World Invitational, etc. Even for just playing PSP games, the Vita is better: OLED display, plus the ability to remap the right stick to the D-pad or buttons. Makes games like MH: Freedom Unite waaay better.
And a lot of indie games, JRPGs, Visual Novels just fit the handheld form factor better than PC or PS4 even if they weren't exclusive.
The PS Vita predated the Switch by 6+ years... by the time emulation and CPU speeds caught up to its requirements, it was already discontinued by Sony. It fulfilled its purpose.
The Vita was a direct downgrade from a PSP if you valued fast loading times, emulation, homebrew, and piracy. Even if you didn't pirate, you gained the ability to rebuy your PSP games, plus new expensive games, with an OLED that had less battery life.
A downgrade from UMDs? Perhaps you're referring to the density of textures and newer, more demanding game engines, because the UMDs were such a pain. Trying hard not to get hyperbolic here.
The true testament to what can be handled on a mobile device these days is showcased on iOS devices. The iPhone 15 Pros can run a port of Resident Evil 7 at supposedly a pretty stable 30fps.
Unfortunately all of Apple's chips are stuck in Apple devices, where they're still struggling to incentivize developers to do ports as standard. So they're pumping out beautiful graphics to beautiful displays for games with shitty touch screen controls that are riddled with ads and/or are just glorified virtual casinos.
It's impressive, but it's also somewhat amazing to consider that the Tegra X1 is still about 3.5-4x as powerful as the GPU in the Raspberry Pi 5 - even if any iPhone chip since the A11 Bionic would beat the X1.
We’ll never see it, but I’ve wondered what a Switch would look like with an Apple chip.
Take something relatively modern but no longer too expensive, say an A15 like the Apple TV has, bump the screen to 1080p, and I bet it would scream. Possibly with better battery life.
Nintendo would never do that. I doubt Apple would either. But it would be a very interesting test.
It's a dynamic resolution in both docked and undocked modes since it appears to be mainly memory bandwidth limited. It does max out at 900p when docked though.
Can anybody answer why the author wrote "indirect branches tend to show up in object oriented languages"?
Given that branches are, well, just branches in any language, what makes OO so special?
Also further down the author states "while FP registers have to be wider to handle vector execution"
Again I'm pretty certain FP registers are larger owing to the greater precision they have, not specifically because they're designed for vector ops... please somebody explain why my understanding is wrong?
Virtual functions yield a lot of indirect branches. Virtual functions are a foundational part of object oriented designs.
FP registers have gotten much larger than the normal types that people store, e.g. 128-, 256-, 512-bit registers. A normal double-precision float (pretty much the largest normally used floating-point representation) occupies 64 bits, while a normal int64 occupies, unsurprisingly, the same 64 bits. But we're getting the mega registers specifically for SIMD functions: operating on four singles at once, and so on.
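To make the indirect-branch point concrete, here's a minimal C++ sketch (the class names are hypothetical; compilers can sometimes devirtualize, so this is the unoptimized picture):

    #include <cstdio>

    struct Shape {
        virtual double area() const = 0;  // dispatched through the vtable
        virtual ~Shape() = default;
    };

    struct Square : Shape {
        double side;
        explicit Square(double s) : side(s) {}
        double area() const override { return side * side; }
    };

    // The compiler can't know which override `s` refers to, so the
    // generated code loads the function address from s's vtable and
    // branches to it indirectly (e.g. `blr` on AArch64).
    double total_area(const Shape& s) { return s.area(); }

    int main() {
        Square sq{3.0};
        std::printf("%f\n", total_area(sq));
    }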
> Given that branches are, well, just branches in any language, what makes OO so special?
Indirect branches. This is a result of vtable indirections.
You are much more likely to encounter vtables in an object oriented language. Obviously, you can still have the same basic thing in a C program, e.g. SDL RWops, but in C++ for example, it's going to show up all over the place.
Indirect branches are common in many OO languages because calling object.method(arg) essentially does object.class.method(object, arg) or object.prototype.method(object, arg) - the address of method is loaded indirectly through the object's "class" field as it may be inherited or not.
(In some cases the compiler may statically know the class of an object, if it's not allowing for subclassing and a potentially overridden method)
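A hand-rolled C++ sketch of that dispatch (all names hypothetical; this is the conceptual shape, not any particular runtime's layout):

    #include <cstdio>

    struct Object;  // forward declaration

    // A hypothetical "class" object holding a method table.
    struct Class {
        void (*method)(Object* self, int arg);  // resolved at call time
    };

    struct Object {
        const Class* cls;  // the "class" field described above
        int state;
    };

    void base_method(Object* self, int arg)    { std::printf("base: %d\n", self->state + arg); }
    void derived_method(Object* self, int arg) { std::printf("derived: %d\n", self->state * arg); }

    const Class BaseClass{&base_method};
    const Class DerivedClass{&derived_method};

    int main() {
        Object a{&BaseClass, 10};
        Object b{&DerivedClass, 10};
        // object.method(arg) becomes: load obj.cls, load cls->method,
        // then branch indirectly to whatever address was loaded.
        a.cls->method(&a, 4);  // prints "base: 14"
        b.cls->method(&b, 4);  // prints "derived: 40"
    }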
Indirect branches are different from regular branches. A regular branch is 'goto label', while an indirect branch is like calling through a function pointer or calling a virtual method.
> Given that branches are, well, just branches in any language, what makes OO so special?
In principle you have calls to whatever method is associated with a given object, which are indirect as there is not one statically known target of the call, but in practice you can optimise a lot of it out <https://youtu.be/9epgZ-e6DUU?t=2543>.
> Also further down the author states "while FP registers have to be wider to handle vector execution"
> Again I'm pretty certain FP registers are larger owing to the greater precision they have, not specifically because they're designed for vector ops... please somebody explain why my understanding is wrong?
ARM NEON is both the 128-bit SIMD unit and the FPU for the system. There isn't a separate FPU apart from the SIMD unit.
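A minimal sketch with NEON intrinsics (assuming an AArch64 toolchain); the scalar add at the end lands in the same register file as the vector add:

    #include <arm_neon.h>
    #include <cstdio>

    int main() {
        // Four single-precision floats packed into one 128-bit NEON register.
        float32x4_t a = {1.0f, 2.0f, 3.0f, 4.0f};
        float32x4_t b = {10.0f, 20.0f, 30.0f, 40.0f};
        float32x4_t sum = vaddq_f32(a, b);  // one instruction, four adds

        float out[4];
        vst1q_f32(out, sum);
        std::printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);

        // A plain scalar add compiles to an FADD on the same registers
        // (s0 is just the low 32 bits of the 128-bit q0/v0 register).
        float scalar = out[0] + out[1];
        std::printf("%f\n", scalar);
    }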
True. But the newer consoles in those cases were huge leaps over the older one.
ARM has 100% backwards-compatible CPUs that Nintendo could use. They’ve done backwards compatibility a number of times now without needing to embed an entire extra CPU, such as on the Wii and the Wii U. I would be surprised if they weren’t thinking about it when they were designing the Switch.
The “embed the old console“ thing always works, but why go that far if there’s a much simpler option?
Well, not paying the ARM royalties is motivation enough to make the jump. But Nintendo is Japanese, and SoftBank, ARM's "owner", is Japanese too, so high-level national interests would short-circuit everything.
That's RISC-V; there is a "compressed" instruction encoding (but I would avoid it). It is more than open source: it is a worldwide royalty-free version of an average modern ISA, and that...