Hacker News

I think there are a few factors that are likely to slow the pace down quite a bit soon:

1. We realistically aren't going much past 4K anytime soon. Even the few 8K sets on the market are more like tech demos, because there isn't really the content for them. Maybe 120/240/etc FPS will be a thing, but that's linear growth, not exponential, and it has a pretty short path left (will 500Hz displays ever become a big seller? 1kHz?)

2. The triple-A games market itself is strained: too many big-money flops, and those are the ones that have historically substituted more triangles for better storytelling.

So you're going to reach a point where the hardware isn't really limiting the designers' visions anymore.

GenAI seems like a questionable approach for game rendering, both because of inefficiency and non-repeatability. If the AI renders the same scene slightly differently on two machines, it could cause bugs or unfair competitive edges. At most, we'd see AI during the development process to build assets, and that doesn't require a bigger local GPU.
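The non-repeatability concern is grounded in a real property of floating-point math: addition isn't associative, so parallel reductions (the kind GPUs do constantly during inference) can produce slightly different results depending on summation order, which varies across hardware and drivers. A minimal illustration:

```python
# Floating-point addition is not associative: summing the same values
# in a different order can yield a different result. Parallel GPU
# reductions don't guarantee a fixed order, so two machines can
# legitimately compute different outputs from identical inputs.
vals = [0.1, 0.2, 0.3, 1e16, -1e16]

# Left-to-right accumulation (the small terms get absorbed by 1e16).
left_to_right = ((((vals[0] + vals[1]) + vals[2]) + vals[3]) + vals[4])

# Right-to-left accumulation (the large terms cancel first).
right_to_left = vals[0] + (vals[1] + (vals[2] + (vals[3] + vals[4])))

print(left_to_right)   # 0.0
print(right_to_left)   # 0.6000000000000001
print(left_to_right == right_to_left)  # False
```

Neural rendering chains millions of such reductions per frame, so tiny divergences can compound into visibly different pixels.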
