
It sounds like even the PS6 isn’t going to offer a significant improvement, and that the PS5 was the last console that did. The PS5 Pro was the first console focused on fake frame generation instead of real output resolution/frame rate improvements, and per the article the PS6 is continuing that trend.


What really matters is the cost.

In the past a game console might launch at a high price point and then, after a few years, the price would go down and they could release a new console at a price close to where the last one started.

Blame crypto, AI, or COVID, but there has been no price drop for the PS5, and if there were going to be a PS6 that was really better it would probably have to cost upwards of $1000, at which point you might as well get a PC. Sure, there are people who haven’t tried Steam + an Xbox controller and think PC gaming is all unfun and sweaty, but they will come around.


Inflation. The PS5 launched at $499 in 2020, which is roughly $630 in 2025 money, about the same as the 1995 PS1 when adjusted for inflation: $299 (1995) is roughly $635 (2025). https://www.usinflationcalculator.com/

Thus the PS6 should be around $699 at launch.
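
The adjustment itself is just a ratio of CPI index values. A minimal sketch, using approximate annual CPI-U figures (the 2025 value is an assumption based on recent readings):

    # Approximate annual CPI-U averages; the 2025 figure is an assumption.
    CPI = {1995: 152.4, 2020: 258.8, 2025: 322.0}

    def adjust(price, from_year, to_year):
        """Scale a nominal price by the ratio of CPI index values."""
        return price * CPI[to_year] / CPI[from_year]

    print(round(adjust(299, 1995, 2025)))  # PS1 launch price -> ~632 in 2025 dollars
    print(round(adjust(499, 2020, 2025)))  # PS5 launch price -> ~621 in 2025 dollars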


When I bought a PS1 around 1998-99 I paid $150, and I think that included a game or two. It's the later-in-the-lifecycle price that has really changed (didn't the last iteration of it get down to either $99 or $49?)


In 2002 I remember PS1 being sold for 99€ in Toys'r'Us in the Netherlands, next to a PS2 being sold for 199€.


The main issue with inflation is that my salary is not inflation adjusted. Thus the relative price increase adjusted by inflation might be zero but the relative price increase adjusted by my salary is not.


The phrase “cost of living increase” is used to refer to an annual salary increase designed to keep up with inflation.

Typically, you should be receiving at least a cost-of-living increase each year. This is standard practice at every company I’ve ever worked for and it’s common across the industry. A true raise is the amount above and beyond the annual cost-of-living increase.

If your company has been keeping your salary fixed during this time of inflation, then you are correct that you are losing earning power. I would strongly recommend you hit the job market if that’s the case because the rest of the world has moved on.

In some of the lower wage brackets (not us tech people) the increase in wages has actually outpaced inflation.


Thank you for your concern, but I'm in Germany so the situation is a bit different, and only very few companies have been able to keep up with inflation around here. I've seen at least a few adjustments but would not likely find a job that pays as well as mine does 100% remote. Making roughly 60K in Germany as a single person in my 30s isn't exactly painful.


> but would not likely find a job that pays as well as mine does 100% remote.

That makes sense. The market for remote jobs has been shrinking while more people compete for the smaller number of remote positions. In-office work comes with a pay premium now, and remote is a highly competitive space.


If you want to work 100% remote you could consider working for a US company as a consultant?


If a US company hires you in Germany, either you get hired by their German branch or by a personnel service provider based in Germany, and thus get paid the "competitive" salaries typical of the country. Otherwise you need some kind of setup where you are a freelancer and you figure out the taxation and statutory insurances on your own, which I'm not familiar with (my freelance IT consultancy side business is rather simple because of its small scale and only domestic customers). That will probably work, and if you manage to get a senior Silicon Valley salary you would probably come out ahead by a bit after taxes and insurance. But you would probably need good tax advisors to avoid expensive pitfalls, and if you work more than 80% for a single employer, the tax administration will be on your case, because false self-employment is a possible method of tax evasion and has been outlawed.


If the client you're consulting for has no presence in Germany then it cannot possibly be false self-employment, surely?


False self-employment is judged solely by whether you spend 80% or more of your worktime working for a single employer.


Typically "cost of living" increases target roughly inflation. They don't really keep up, though, due to taxes.

If you've got a decent tech job in Canada your marginal tax rate will be near 50%. Any new income is taxed at that rate, so a 3% COL raise only adds about 1.5% of your gross salary to your take-home pay, which typically leaves you worse off once prices have risen 3%.

Until you're at a very comfortable salary, you're better off job hopping to boost your salary. I'm pretty sure the financial people are well aware they're eroding their employees' salaries over time, and are hoping you are not aware.
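
A rough sketch of that arithmetic, with assumed illustrative figures (50% marginal rate, 30% average rate, 3% inflation matched by a 3% gross raise), not actual tax tables:

    # Assumed illustrative figures, not actual Canadian tax tables.
    gross = 100_000
    avg_rate = 0.30        # assumed average tax rate on existing income
    marginal_rate = 0.50   # assumed marginal rate applied to the raise
    inflation = 0.03

    net_before = gross * (1 - avg_rate)
    raise_net = gross * inflation * (1 - marginal_rate)  # the 3% raise, taxed at the margin
    net_after = net_before + raise_net

    real_change = (net_after / net_before) / (1 + inflation) - 1
    print(f"{real_change:.2%}")  # about -0.83%: take-home pay buys slightly less than before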


Tax brackets also shift through time, though less frequently. So if you only get COL increases for 20 years you’re going to be reasonably close to the same after-tax income, barring significant changes to the tax code.

In the US the bottom tax bracket was 10% under $19,750 in 2020, then 12% for the next bracket; in 2025 it’s 10% under $23,850, then 12% for the next bracket. https://taxfoundation.org/data/all/federal/historical-income...


And here I am in the UK, where the brackets have been frozen until 2028 (if they don't invent some reason to freeze further).


Freezing tax brackets is a somewhat stealthy way to shift the tax burden to lower income households as it’s less obviously a tax increase.


Is your salary the same as 10 years ago?


Those in charge of fiat printing presses have run the largest theft of wealth in world history since 1971, when the dollar decoupled from gold.


Cash is a small fraction of overall US wealth, but inflation is a very useful tax on foreigners using USD thus subsidizing the US economy.


But now you’re assuming the PC isn’t also getting more expensive.

If a console designed to break even is $1,000 then surely an equivalent PC hardware designed to be profitable without software sales revenue will be more expensive.


You have to price it in equivalent grams of gold to see the real price trend.


Says who?

Economists use the consumer price index, which tracks a wide basket of goods and services.

Comparing console prices to a single good is nonsense; even if that good has 6,000 years of history, it's not a good benchmark in a vacuum.


PCs do get cheaper over time though, except if there is another crypto boom, then we are all doomed.


"PCs do get cheaper over time though"

PCs get cheaper, but GPUs don't.


A GTX 1050 Ti was $139 nine years ago. Getting a Ryzen 8700G instead of an 8700F gives you more and costs you $39 extra today!


No way it costs only $39 more; also, inflation exists.


As long as I need a mouse and keyboard to install updates or to install/start my games from GOG, it's still going to be decidedly unfun, but hopefully Windows' upcoming built-in controller support will make it less unfun.


Today you can just buy an Xbox controller and pair it with your Windows computer and it just works, and it’s the same with a Mac.

You don’t have to install any drivers or anything and with the big screen mode in Steam it’s a lean back experience where you can pick out your games and start one up without using anything other than the controller.


I like big picture mode in Steam, but.... controller support is spotty across Steam games, and personally I think you need both a Steam controller and a DualSense or Xbox controller. Steam also updates itself by default every time you launch, and you have to deal with Windows updates and other irritations. Oh, here's another update for .net, wonderful. And a useless new AI agent. SteamOS and Linux/Proton may be better in some ways, but there are still compatibility and configuration headaches. And half my Steam library doesn't even work on macOS, even games that used to work (not to mention the issues with intel vs. Apple Silicon, etc.)

The "it just works" factor and not having to mess with drivers is a huge advantage of consoles.

Apple TV could almost be a decent game system if Apple ever decided to ship a controller in the box and stopped breaking App Store games every year (though live service games rot on the shelf anyway.)


> [...]controller support is spotty[...]

DualShock 4 and DualSense support under Linux is rock-solid, wired or wireless. That's to be expected since the drivers are maintained by Sony[1]. I have no idea about the Xbox controller, but I know the DualSense works perfectly with Steam/Proton out of the box, with the vanilla Linux kernel.

1. https://www.phoronix.com/news/Sony-HID-PlayStation-PS5


To clarify, I meant controller support in the Steam games themselves. Some of them work well, some not so well. Others need to be configured. Others only work with a Steam controller. I wish everything worked well with the DualSense, especially since I really like its haptics, but it's basically on the many (many) game developers to provide the same kind of controller support that is standard on consoles.


Thanks for the clarification. I've run into that a couple of times - Steam's button remapping helps sometimes, but you have to remember which controller button the on-screen symbol maps to.


Are you sure you have Steam configured right? Because with Steam Input you can get proper Xbox controller emulation in games that don't support PS4/PS5 and NS controllers. It's not perfect, but you should never be stuck just because you don't have an Xbox or Steam controller when running games inside Steam.


Lots of games on Steam simply don't have great (or really any) controller support. Steam controller can sort of play some of them though since it can emulate mouse + keyboard etc.

My experience with Steam Input is ... OK in some cases. It's annoying that it seems to break games that actually do support the DualSense properly (though full haptics only work in wired mode) like FFXIV.


But when I have to install drivers, or install a non-Steam game, I can't do that with the controller yet. That's what I need for PC gaming to work in my living room.


Or you just need a Steam controller. They're discontinued now but work well as a mouse+keyboard for desktop usage. It got squished into the Steam Deck so hopefully there's a new version in the future.


If you have Steam, PS4/PS5 controllers also work fine.


They do not work fine in every game. That is why I think you need a Steam controller as well.


They do but they cost a lot more.


My ps5 came with one for “free”


Plus add your GOG games as non-Steam games to Steam and launch them from big screen mode as well.


Launch Steam in big screen mode. Done.


I'm aware of Big Picture Mode, and it doesn't address either of the scenarios I cited specifically because they can't be done from Big Picture Mode.


How many grams of gold has the PS cost at launch, using gold prices on launch day?


If I'm doing this right, then:

PS1: 24.32 grams at launch

PS5 (disc): 8.28 grams at launch

(So I guess that if what one uses for currency is a sock drawer full of gold, then consoles have become a lot cheaper in the past decades.)
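
The arithmetic is just launch price divided by the spot price per gram on launch day. A minimal sketch with assumed, approximate spot prices around each US launch:

    TROY_OUNCE_G = 31.1035

    # Assumed approximate gold spot prices around each US launch date (USD per troy ounce).
    consoles = {
        "PS1 (Sep 1995, $299)": (299, 383),
        "PS5 disc (Nov 2020, $499)": (499, 1875),
    }

    for name, (price_usd, gold_per_oz) in consoles.items():
        grams = price_usd / (gold_per_oz / TROY_OUNCE_G)
        print(f"{name}: {grams:.2f} g of gold")
    # PS1 (Sep 1995, $299): ~24.3 g of gold
    # PS5 disc (Nov 2020, $499): ~8.3 g of gold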


I'm still watching 720p movies and playing 720p video games.

Somewhere between 60Hz and 240Hz, there are zero fundamental benefits. Same for resolution.

It isn't just that hardware progress is a sigmoid; our experiential value is too.

The reality is that exponential improvement is not a fundamental force. It's always going to find some limit.


On my projector (120 inch) the difference between 720p and 4k is night and day.


Screen size is pretty much irrelevant, as nobody is going to be watching it at nose-length distance to count the pixels. What matters is angular resolution: how much area does a pixel take up in your field of vision? Bigger screens are going to be further away, so they need the same resolution to provide the same quality as a smaller screen which is closer to the viewer.

Resolution-wise, it depends a lot on the kind of content you are viewing as well. If you're looking at a locally-rendered UI filled with sharp lines, 720p is going to look horrible compared to 4k. But when it comes to video you've got to take bitrate into account as well. If anything, a 4k movie with a bitrate of 3Mbps is going to look worse than a 720p movie with a bitrate of 3Mbps.

I definitely prefer 4k over 720p as well, and there's a reason my desktop setup has had a 32" 4k monitor for ages. But beyond that? I might be able to be convinced to spend a few bucks extra for 6k or 8k if my current setup dies, but anything more would be a complete waste of money - at reasonable viewing distances there's absolutely zero visual difference.

We're not going to see 10,000Hz 32k graphics in the future, simply because nobody will want to pay extra to upgrade from 7,500Hz 16k graphics. Even the "hardcore gamers" don't hate money that much.
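
To make the angular-resolution point concrete, here's a rough sketch computing pixels per degree of visual angle; the screen sizes and viewing distances are assumed examples, and roughly 60 pixels per degree is the commonly cited limit of 20/20 acuity:

    import math

    def pixels_per_degree(diagonal_in, horizontal_px, vertical_px, distance_in):
        """Approximate horizontal pixels per degree of visual angle at the screen centre."""
        aspect = horizontal_px / vertical_px
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        px_width = width_in / horizontal_px
        # angle subtended by one pixel at the given viewing distance
        deg_per_px = math.degrees(2 * math.atan(px_width / (2 * distance_in)))
        return 1 / deg_per_px

    # Assumed example setups: 120" projector viewed from 10 ft, 32" monitor from 2.5 ft.
    print(round(pixels_per_degree(120, 1280, 720, 120)))   # 720p projector  -> ~26 ppd
    print(round(pixels_per_degree(120, 3840, 2160, 120)))  # 4k projector    -> ~77 ppd
    print(round(pixels_per_degree(32, 3840, 2160, 30)))    # 4k desk monitor -> ~72 ppd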


Does an increased pixel count make a bad movie better?


Does a decreased pixel count make a good movie better?


> I'm still watching 720p movies and playing 720p video games.

There's a noticeable and obvious improvement from 720p to 1080p to 4k (depending on the screen size). While there are diminishing gains, up to at least 1440p there's still a very noticeable difference.

> Somewhere between 60Hz and 240Hz, there are zero fundamental benefits. Same for resolution.

Also not true. While the difference between 40fps and 60fps is more noticeable than, say, 60fps to 100fps, the difference is still noticeable enough. Add to that the reduction in latency, which is also very noticeable.


Is the difference between 100fps and 240fps noticeable though? The OP said "somewhere between 60hz and 240hz" and I agree.


Somewhere between a shoulder tap and a 30-06 there is a painful sensation.

The difference between 60 and 120Hz is huge to me. I haven't had a lot of experience above 140Hz.

Likewise, 4k makes a huge difference in font rendering, and 1080p to 1440p is big in gaming.


4K is big but certainly was not as big a leap forward as SD to HD


That would be a very obvious and immediately noticeable difference, but you need enough frames rendered natively (not with latency-increasing frame generation) and a display that can actually do 240Hz without becoming a smeary mess.

If you have this combination and play with it for an hour, you'll never want to go back to a locked 100Hz game. It's rather annoying in that regard, actually.


Even with frame generation it is incredibly obvious. The latency for sure is a downside, but 100 FPS vs 240 FPS is extremely evident to the human visual system.


> Is the difference between 100fps and 240fps noticeable though?

Yes.

> The OP said "somewhere between 60hz and 240hz" and I agree.

Plenty of us don't. A 240Hz OLED still produces a significantly blurrier image in motion than my 20+ year old CRT.


Surely that 20+ year old CRT didn't run at more than 240Hz? Something other than framerate is at play here.


> Surely that 20+ year old CRT didn't run at more than 240Hz?

It didn't have to.

> Something other than framerate is at play here.

Yes: sample-and-hold motion blur, which is inherent to virtually all modern display types in common use.

Even at 240Hz, modern displays cannot match a CRT for motion quality.

https://blurbusters.com/faq/oled-motion-blur/


Lower latency between your input and its results appearing on the screen is exactly what a fundamental benefit is.

The resolution part is even sillier - you literally get more information per frame at higher resolutions.

Yes, the law of diminishing returns still applies, but 720p@60Hz is way below the optimum. I'd estimate 4k@120Hz as the low end of optimal, maybe? There's some variance w.r.t. the application (a first-person game has different requirements from a movie), but either way 720p ain't it.


Really strange that a huge pile of hacks, maths, and more hacks became the standard of "true" frames.



