With all three major console manufacturers prioritizing backwards compatibility, and the rise in PC gaming (universally backwards compatible), people are starting to catch on to the fact that old games don't "expire" after 10 years. I wouldn't be surprised if backwards compatibility just becomes the standard for all gaming consoles going forward.
Tangential, but I'm also interested in seeing how games that launched on old consoles and continue to be played, like Fortnite, will support aging hardware. I don't like that Epic can one day announce the game simply no longer works on that console, rendering your purchases null and void until you upgrade your hardware, but I can't expect them to update that version of the game forever.
That hardware can no longer compete with platforms that don't throw away their entire library on every release is probably one of the first impacts of games finally maturing. My "next console" was a Steam Deck partly for this very reason: it came preloaded with years of previous acquisitions.
We're also just seeing the leading edge of the game industry having to deal with the fact that it now has to compete against itself. There's been a number of articles about how $NEW_GAME never even reached a peak player count of something like Skyrim. I think that's currently being written up as a sort of "ha ha, that's sorta funny", but it represents a real problem. It is not unsolvable; Hollywood has always faced this issue and it has historically managed to make money anyhow. But I think AAA gaming is only just beginning to reckon with the fact that they aren't going to get a "free reset" on every console generation. $NEW_GAME really is in competition with Skyrim now, along with a lot of other things. It's not a joke, it's an emerging reality the industry is going to have to grapple with.
It's harder for video games, because movies take only a couple hours to finish, and you generally want something you haven't seen before each movie night. Video games can take hundreds of hours to play to completion, and some games you can enjoy replaying tens of thousands of times. So the competition from the existing games library is very tough.
I completely agree but I would actually extend this principle even more aggressively. Even if, for whatever reason, we were hard capped technologically at Windows 98, even that space could be fruitfully explored practically without end, creating new genres, new stories, new games.
Fiction writing carries on just fine in books, and music has certainly benefited from new tech and new methods, but there would always be music even if that weren't the case, and same with cinema. I would put tabletop games in this category too. Its continued future viability, independent of future tech advancements, may be an important factor in settling whether it's art.
Full credit to Nintendo for recognizing they had plenty of unused creative space to play in, and choosing to play by different rules.
I've said before that I've got a list of games going back 20-odd years that I'd like to play through in retirement, so I'm not the target market, but for online multiplayer games there needs to be a player base that makes it worthwhile, and the swarms are fickle and fast-moving. Helldivers 2 is a recent example of where a large community swarmed.
Having said that, and as someone else pointed out, enduring games like Fortnite will have to cut off certain aging hardware at some point if it's to remain a viable magnet to the swarms.
Aside: I used to go to LANs back in the Quake2 days, and was annoyed with Counterstrike because it essentially halved the player pool of Q2 FFA fragfests. The fragmentation of the market has only continued since then, but the market has also greatly increased in size. I did very much enjoy the unchained chaos of large scale Q2/Q3 FFA and Rocket Arena. Good times.
I don't think there's ever been a console generation before where the last generation was still getting big new releases this deep into the next one. The PS5 Pro is out now and the PS4 is still getting new games.
An analogy I might draw is the FIFA games, where FIFA 14 came out on the PS4 and PS3, but also the PS2 and Wii, which were just roster updates of previous years (no new gameplay features whatsoever), and clearly that was acceptable to enough people to be worth EA's trouble of developing, printing and distributing them.
And people want to see that specific one NEW movie, too, not even just "the young". Even now, after all that has happened, Hollywood can still put butts in theater seats for a new movie, even though the attendees probably average several dozen movies at home and probably still have literally hundreds of movies they would enjoy as much or more than the one they are watching in the theater. A lengthy essay could be written on why, which I'll let someone else write.
But I can promise you from personal experience that a 2024 gamer has an easier time picking up and enjoying a 2014 game than a 2004 gamer would have picking up a 1994 game, to the point that it is not even close.
Checking a list of games from 2014... heck, I've got personal proof: my young teen recently started Shadow of Mordor. While it didn't "stick" (we got Skyrim somewhat after that, and that has stuck; its initial release was 2011, though the history is complicated and I won't complain if someone wants to forward-date it at least a bit), he wasn't like "oh my gosh this looks so bad and the QoL is so terrible I can't play this anymore". Others from 2014 include Super Smash Bros Wii U, Assassin's Creed IV: Black Flag, and The Last Of Us: Left Behind. Really not that dissimilar from what is being put out today.
Whereas 2004 to 1994 is the delta between Grand Theft Auto: San Andreas and Sonic 3 and Knuckles. That's huge. Yes, I'm old enough to have been there, and I can tell you from personal experience that in 2004 "Sonic 3 and Knuckles" was very definitely legacy in a way that The Last Of Us: Left Behind is not. If you tell someone today that you just started the latter, they might wonder why you're late to the party but they're not going to think anything more of it.
I'm sure D4 is more modern, but the difference from D3 is nowhere near D2->D3 for the same time span (12 years).
It's hard to release new live-service games too. Many people will just be happy to play LoL for the rest of their lives.
But that compatibility is not achieved with emulation, right?
The PS6 can hopefully keep compatibility with PS5 and PS4 in a similar way. Unless we are nearing some sort of ARM horizon for consoles, that is.
The documents accidentally leaked from the FTC vs. Microsoft trial revealed that Microsoft was at least considering switching to an ARM CPU with the next Xbox generation, but they hadn't decided yet at the time those documents were written. Either way they would still use an AMD GPU, so it would be AMD+AMD or ARM+AMD.
A platform inside the platform.
That is why console sales are so bad compared with the growth of previous generations.
The young I know play free mobile games they downloaded from clickbait ads.
That's not true at all, many games don't bother with the Switch at all because of dev costs, and Fortnite, one of the most popular games in the world, is struggling on the Switch. I know because I play FN on Switch occasionally, and you can quite literally see all the pain that went into making all that complexity work at approx 25fps.
Even Nintendo can't make the latest Zeldas run at >30fps, and they're relatively low fidelity.
I've been organizing LAN parties with my friends for 26 years now, and around 2010 to 2016 was the time when games became so good that it stopped making sense to upgrade in between LAN parties.
- Left 4 Dead 2
- Killing Floor 2
- CS:GO
- Grid 2
- GTA V
- StarCraft II
plus nowadays there's stiff free competition, e.g.
- Rocket League
- Brawlhalla
- Dota 2
- LoL
but also from OpenRA, which modernizes Red Alert.
Plus, it's challenging to tell from screenshots whether you're looking at Assassin's Creed III (from 2012) or Assassin's Creed Mirage (from 2023), and there have been 7 (!!!) other Assassin's Creed games in between.
And looking at the Switch, I'd say the situation for new games is brutal. There are lots of evergreen games with great replayability, and thanks to the cartridges you can easily lend them around a group of friends. It's been a while since I last bought a new one because there just wasn't anything different enough from what I already have and like.
My biggest wish for the Switch has been that it'll one day drive my screen at 144Hz to make movement smooth. And it looks like Nintendo is going to deliver exactly that: More powerful hardware for the same old games.
I wonder if Nintendo will also eventually be forced to implement a subscription model and/or if they will start to aggressively push older games without updates out of their store (like what Apple does) because otherwise I just don't see many openings for developers to build a new Switch game and make the financials work. Currently, you're competing with a back catalogue of 4,747 games, so good luck finding anything where you can stand out by being better.
> - StarCraft II
I thought Starcraft II didn't allow LAN play?
So yes, technically not a LAN game, but in practical terms any modern LAN party also probably has internet. It's not the hurdle it used to be.
Consoles used to have very bespoke architectures, but now are switching to customized versions of relatively off-the-shelf components. Both the PS5 and the last XBox use x86 AMD CPU+GPU combos, probably a variation of their regular G product line.
The games on the Wii might have been super novel and innovative, but most of them were kind of junk that wouldn't pass today. Now most new games seem to come with 100+ hours of content and extremely polished gameplay. Rather than building 4 games for 4 platforms, you can spend 4x more to develop one game.
This was something that confused me about the concept of consoles in the 90s. The nonexistent value proposition of a console hasn't changed since then.
I assume they serve two purposes:
(1) They're marketed as toys you might buy for someone as a gift.
(2) You might own a console if you don't want to own a computer.
Purpose (2) seems to have withered and died.
> There's been a number of articles about how $NEW_GAME never even reached a peak player count of something like Skyrim. I think that's currently being written as a sort of a "ha ha, that's sorta funny", but it represents a real problem. It is not unsolvable; Hollywood has always faced this issue and it has historically managed to make money anyhow.
One major aspect of copyright law is making it difficult for people to consume media from the past.
(4) Don't have money for a computer (there is a lot of overlap here; a PC may or may not be cheaper for a given perf level, depending on the case)
(5) Gift bought by non tech-savvy family member
(6) Do own a computer, but just want a different and more plug and play device to relax with after staring at said computer for 10 hours a day
Traditionally for these "Live Service"-type games, they announce cutting support for a console, but let you carry your purchases in that specific game (subscription, add-on items, etc), forward to the same game on the next gen of that console.
For example, how Final Fantasy 14 ended PS3 support - https://www.gamedeveloper.com/game-platforms/-i-final-fantas... and how Grand Theft Auto 5 ended PS3 support - https://www.ign.com/articles/gta-online-support-ending-xbox-...
It's not a guarantee, but I'd expect something similar for Fortnite.
Backwards compatibility is very "cheap" these days though? With no arcane architectures and chip designs. PS5 and Xbox are basically just generic PCs running a restricted OS and Switch is just a phone/tablet.
If the GPU access is through a relatively "thick" API like DX/Vulkan and shaders stored in an intermediate representation like DXIL or SPIR-V, sure, swapping out the hardware implementation is relatively easy.
But if they're shipping GPU ISA binaries as the shaders, you'll have a much harder time ensuring compatibility.
Same with things like synchronization, on both the CPU and GPU (and any other offload devices like DSPs or future NPUs). If they use API-provided mechanisms, and those are used /correctly/, then the implementation can likely be changed. But if they cycle-count and rely on specific device timing, again all bets are off.
Things like DX12 and Vulkan have a large number of sync points and state transition metadata to allow different implementations to be "correct" in things like cache or format conversions (like compression). Not all those transitions are required for every vendor's hardware, and we regularly see issues caused by apps not "correctly" using them when the spec says it's required, as the vendor's hardware they happened to test on didn't require that one specific transition that another implementation might, or they happened across some timing that didn't happen to hit issues.
I guess my point is: compatibility is hard even if the APIs are intentionally designed to allow it. I have no idea how much the idea of such compatibility has been baked into console APIs in the first place. One of the primary advantages of consoles is to allow simplifications from targeting limited hardware, so I can only assume they're less compatibility-focused than the PC APIs we already have Big Problems with.
When Apple switched to ARM, even with the x64->ARMv8 translation layer (NOT emulation), it was still noticeably slow in a lot of software. Even though some x64 games worked on ARM Macs, they still lost A LOT of performance.
The backwards compatibility of the PS2 was due to the PS2 literally including an extra PS1 CPU (technically PS1-like CPU underclocked to match the original PS1 CPU when running PS1 games). On PS2 games this PS1 CPU handled only I/O so it wasn't completely wasted when running PS2 games.
https://en.wikipedia.org/wiki/PlayStation_2_technical_specif...
The PS2 CPU is a MIPS III while the PS1 CPU is a MIPS I. I am not an expert, but I think MIPS III is only backwards compatible to MIPS II, not MIPS I.
Isn't it pretty much just the Wii and Wii U? I guess you could play GameCube discs on a Wii, but calling the Wii a modernized version of the GameCube is a real stretch.
New 3DS crying in the corner because it didn't even get a side mention, which about matches the number of exclusives it had.
I've always wondered how true this is — I feel like if it was literally true, we'd see a lot of NES ROMhacks that involve editing the ROM's layout and metadata bits just enough that it's now a SNES ROM, and then taking advantage of SNES capabilities in the mod. But I don't believe I've ever seen something like that.
I do understand that the SNES CPU is basically a "very extended" 6502; and that the SNES PPU's default-on-boot graphics mode is compatible with drawing NES-PPU-formatted CHR-ROM data; and that there's a "legacy compatibility" joypad input MMIO in the right place in address space to allow a game that was programmed for the NES to read the "NES subset" of a SNES controller's buttons.
But is the SNES's (variant) 65C816 ISA a strict superset of the NES's (variant) 6502 ISA? Or would they have had to effectively go through the assembly code of SMB3 with a fine-toothed comb, fixing up little incompatibilities in the available instructions here and there, to get it to run on the SNES?
(Though actually, even if they did have to do that, I imagine it would still be possible to automate that process — i.e. it would be theoretically possible to write a NES-to-SNES static transpiler. In fact, it's so seemingly tenable that I'm a bit surprised to have never heard of such a project!)
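A first pass of such a hypothetical transpiler might just scan the ROM for opcode bytes that are not part of the documented 6502 instruction set, since (as I understand it) the 65C816's emulation mode runs documented 6502 code but reassigns the undocumented "illegal" opcodes to new instructions. A toy Python sketch of that idea, with made-up names (`DOCUMENTED_6502`, `find_incompatible_opcodes`), a table covering only a handful of the 151 documented opcodes, and the known weakness that a naive linear sweep misreads data interleaved with code:

```python
# Toy sketch: flag byte positions in a 6502 code blob whose opcodes are not
# in the documented 6502 instruction set. On the 65C02/65C816, those byte
# values decode to different (new) instructions, so a real NES-to-SNES
# static translator would have to rewrite any code that relies on them.

# Opcode -> total instruction length (opcode + operand bytes).
# Tiny subset for illustration; a real tool needs all 151 documented entries.
DOCUMENTED_6502 = {
    0xA9: 2,  # LDA #imm
    0xAD: 3,  # LDA abs
    0x8D: 3,  # STA abs
    0x20: 3,  # JSR abs
    0x4C: 3,  # JMP abs
    0x60: 1,  # RTS
    0xEA: 1,  # NOP
}

def find_incompatible_opcodes(code: bytes):
    """Linear sweep; returns (offset, opcode) pairs for unknown opcodes.
    Naive: real ROMs interleave data with code, so expect false positives."""
    flagged, pc = [], 0
    while pc < len(code):
        op = code[pc]
        if op in DOCUMENTED_6502:
            pc += DOCUMENTED_6502[op]  # skip operand bytes too
        else:
            flagged.append((pc, op))
            pc += 1  # unknown length, so resync byte by byte
    return flagged
```

For example, `LDA #$42 / STA $2000 / RTS` scans clean, while a blob starting with the illegal opcode `$02` gets flagged at offset 0. The real engineering work would then be rewriting each flagged instruction into an equivalent documented sequence, which is exactly the fine-toothed-comb job described above, just automated.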
The graphics chip was even fixed-function, like the Gamecube's, not shader-based like the Xbox 360 or PS3.
The graphics architecture was even the same between Wii and GameCube - ATI's Flipper, just with 50% higher clocks on the Wii.
We grew from the 8 bit home computers, lived through 16 bit home computers and settled in PC gaming.
Nintendo was mostly about those game & watch handhelds, naturally SEGA and PlayStation became relevant, replaced by XBox and PlayStation, but always on the shadow of PC gaming.
Microsoft and Sony have demonstrated that hardware security can be more or less perfected; neither of their systems has been compromised via hardware attacks for several generations now.
The Xboxes have held up extremely well on the software front as well, and although the PlayStation software isn't so robust (they use FreeBSD and routinely get owned by upstream CVEs), their secure boot has never been broken, which limits how much you can do with a software jailbreak. PS3 jailbreaks had continuity, where you could upgrade an exploitable firmware to a non-exploitable one while retaining a backdoor, but the PS4's secure boot put an end to that.
And for Rust fans, its firmware has been rewritten.
Perceived failure of the Wii U and the total reboot of the Switch project itself: https://mynintendonews.com/2020/12/22/nintendo-leak-shows-sw...
A decade ago the engineers designing these chips knew there were several angles of attack but there just wasn’t enough resources put into closing these holes.
Now every known angle of attack is closed. Even if you delid the chip, reverse engineer every single gate, and can probe individual metal wires on the chip, it'll still be nearly impossible to break the hardware security. Power supply and EM glitching are also protected against (can't speak for the Switch 2, but I'm speaking in general about chips going forward).
Could be bugs and mistakes that allows someone to bypass security, of course. Both in hardware and software. But I don’t think there will be general purpose angles of attack that can be used to bypass security going forward.
Microsoft talked openly about implementing those safeguards in the Xbox One, and they've held up for a decade or so now.
Do you mean that the protection on the firmware gets refreshed with updates, but the secret it protects always stays the same?
Even if the software is absolutely bulletproof, you can hack almost everything by modifying the hardware. Cutting the power of the CPU for a tiny amount of time for example can cause it to glitch in a way that bypasses the security checks. This is accessible enough for at least one person to get in and dump games.
When it comes to video games, that's not much of a demonstration in the grand scheme of things.
I think Nintendo has a case to make that Switch emulation is costing them real money.
As for people choosing an emulator over buying a Switch: too bad, that's how competition works.
There are legal issues around how to legally obtain emulatable copies of the games you own, but emulation is absolutely legal.
(This is not a commentary on whether the emulators in question were careful in every other way.)
Still, Nintendo's motive is to defend their IP.
Even if the lawsuits go nowhere, it still works for them.
See, https://www.pcgamer.com/gaming-industry/switch-emulator-ryuj....
If Nintendo's IP wasn't involved they wouldn't give a rat's ass about the emulation scene.
https://en.wikipedia.org/wiki/List_of_copyright_case_law
https://en.wikipedia.org/wiki/List_of_patent_case_law
https://en.wikipedia.org/wiki/List_of_trademark_case_law
https://en.wikipedia.org/wiki/List_of_United_States_patent_l...
https://en.wikipedia.org/wiki/List_of_United_States_Supreme_...
https://en.wikipedia.org/wiki/List_of_United_States_Supreme_...
https://en.wikipedia.org/wiki/List_of_United_States_Supreme_...
Did you not know this?
I believe emulation is legal in the US.
This is how Game Boy Color, DSi and 3DS systems handled accepting games from older models.
- smaller
- energy efficient
- cost saving
and they are all valid reasons; it's a handheld, and the form factor will evolve until perfected
For cost, they could likely reduce the pincount for new cartridges, by changing the number of data pins, but that doesn't preclude using the same slot. Reducing cost of cartridges is more effective than reducing the cost of the console. Reducing pin count would probably save more money than shrinking the small amount of plastic case.
For energy efficiency, maybe they can eliminate 3.3v and only keep 1.8v for new carts, maybe redesign the insertion detection pins to detect old and new.
Nintendo also seems to be the least price gouge-y, in terms of lootboxes and microtransactions and other bullshit. Now I wish that didn't come with the tradeoff of them being completely anal when it comes to people posting OSTs online but I guess I'll take it.
Gameboy Color supported OG Gameboy games
GBA supported GBC games
DS supported GBA games (but not GBC, as it turns out)
3DS supported DS games.
The 3DS also had games from other consoles for sale in the eShop, but they were emulated (GB, GBC, Game Gear, NES, SNES). If you bought a 3DS before the price drop, you could also play some GBA games. These are also running natively, not emulation https://en-americas-support.nintendo.com/app/answers/detail/...
Now we've arrived at a fairly locked in set of architectures.
Bonus if they invent an AI that can fix the crash bugs in the binary.
https://www.eurogamer.net/digitalfoundry-2023-inside-nvidias...
The basis for the rumour is basically Linux kernel code and other leaks/hacks for a "T239" SoC that seemingly has all the streamlining and features you'd want for a mobile gaming processor (as opposed to a automotive SoC like the T234 it's supposedly derived from).
The Samsung fab rumour is based on the T234 being fabbed by Samsung using a ~5-year-old process, and Korean industry rumours (https://m-mk-co-kr.translate.goog/news/business/10999380?_x_...).
Even if they don't need that money, it's still good to deny the competition such a lucrative contract.
Presumably it will reduce their current gross margins (which won't necessarily look great in their quarterly reports). Nvidia's total revenue is only ~20% higher than Intel's was back in 2021, despite the insane valuations (in large part due to their obscene margins).
Fourth time lucky?
(Poor ol' Nvidia has had an unfortunate history with this, arguably largely through no fault of their own. The Zune, the Kin with Tegra 1, the Motorola Xoom with Tegra 2, a variety of less-beloved tablets and weird phones with Tegra 3. I think the only successful use-case besides Nintendo and car infotainment stuff was Nvidia's own Shield.)
This also means that the Switch SoC doesn't use an expensive cutting edge manufacturing process. And it probably won't be made in TSMC factories at all. Leaks pretty clearly indicate an Nvidia Ampere based SoC built on Samsung's 8nm process, so it's the same tech as Nvidia's consumer line circa 2020.
Which would mean the SoC is even more outdated than the Switch 1 SoC was at launch. Reason is probably that Nintendo originally wanted to release the new hardware significantly earlier.
I really don't understand why they are planning those chips apparently many years in advance, when some other manufacturer (AMD, Qualcomm, Intel, MediaTek) could have supplied a more modern SoC without many modifications in a relatively short timeframe at a better price than Nvidia.
This would have made backwards compatibility more difficult, but I don't think this is that big of an issue anyway. Nintendo often didn't have it in the past, and few people complained. After all, old games can still be played on the old hardware.
- Almost every first-party multiplayer Nintendo game on the Switch that I know of has offline local multiplayer. The only exception which comes to mind is Splatoon.
- The Switch has a cartridge slot, and leaks suggest the Switch 2 will too.
- And you can connect two (possibly more with a hub) Pro controllers with a true wired connection: https://en-americas-support.nintendo.com/app/answers/detail/...
Fingers crossed that the Switch 2 maintains this pattern.
It does, but it's hidden behind an unlisted button combination (ZL + ZR + L3 for Splat 3) and every player needs their own console and copy of the game: https://splatoonwiki.org/wiki/The_Shoal#LAN_Play
I am a little disappointed they don't have anything like the DS's download play feature though.
- Battery life isn’t really a problem on full-sized controllers (and the failure modes are “walk the dog around the block while it charges enough for a couple-hour session” or “it becomes a wired controller for a few minutes”), including the Nintendo ones, just the damn joy-cons. Those do suck, but the basic idea of wireless controllers has proven to be really good, not like the old WaveBird days.
- The Switch is easily the best local multiplayer modern console AFAIK, including lots and lots of co-op options.
Many games were not ported to it because it used cartridges that couldn't hold anywhere near the data of a CD-ROM like its peers.
The controller was amazing though.
...
What?
The thumbstick was super shoddy and was prone to mechanical failures, the ridiculously tiny d-pad was literally made for ants. The N64 was a lot of things, but I don't know anyone who's giving out accolades for the controller design.
The GC controller (outside of the HUGE shoulder bumpers that were used as analog in a grand total of like 4 racing games) was a vast improvement on it, and I would say that the Switch Pro Controller ranks up there as one of Nintendo's best though the cost of $60/$70 kind of stung.
Con: Assuming native compatibility, this likely won’t be a very exciting console.
Hopefully Nintendo learned its lessons from the Wii U.
>Hopefully Nintendo learned its lessons from the Wii U.
That’s my concern: Nintendo doesn’t like incremental titles like “Switch 2”. They’d rather call it something weird like “Switch Me”, which only confuses uninformed customers.
To be fair, I predict a Netflix of gaming in the future so maybe this is a safe move, idk.
I agree that the Switch 2 will likely be "more of the same", but I don't really see how that relates to back-compat?
I guess I was under the assumption that because of the Joy-Cons' unique format, it would be hard to pull off fully compatible support. But I didn't own a Switch for very long, so idk if that's true.
Also, I think the $700 PS5 Pro wants a word with you.
People understand Playstation 1, 2, 3, 4, 5 just fine so that simply isn't true.
Also consumer confusion is not a good excuse to ignore having backwards compatibility.