dylanoconorkinja

Oh yeah, the near instant loading and stuff like Quick Resume, for sure - but that’s kind of my point. If we’re most impressed by ‘features’ like those, rather than performance improvements, it’s just hard for me to grasp the jump from 30 to 60 as being such a big deal.

Interesting; part of the thing that made me finally throw up my hands at the concept is that I get a Series X and don’t... really notice a difference. Which isn’t to say I don’t notice a difference between the two consoles at all - I went straight from an original Xbox One to a Series X, which means I’m also upgrading

To be fair, the entire point of the original comment was about the extreme positions, remember? I’m not saying you - or anyone who wants high frame rates - shouldn’t enjoy it, any more than I’m saying the guy who wants a $100 glass of wine shouldn’t enjoy his expensive hobby. But there’s also someone a little higher

I can see the gap; I just don’t find it a particularly wide gap. 

I hadn’t thought about the input device as a factor; that’s a good point!

Honestly, I think that’s a really good point about film and the benefits of 24fps - I’m just not sure that I agree with your ultimate conclusion that videogames, across the board, inherently benefit from the clarity of higher fps. Again, for ‘competitive’ gaming, especially high level competitive gaming? Sure,

Same; I can tell there is a difference between 30 and 60, but I’d honestly be hard-pressed to say that I preferred 60 over 30, and if so, why. And over 60, I feel like I just can’t see it at all.

Probably not! I do most of my gaming on consoles; my rare PC gaming is on an ancient laptop that wasn’t even top of the line when it was new. My exposure to high fps displays has been limited to high end televisions/monitors that other people had set up, in that ‘hey, come take a look at my new rig, isn’t it awesome!’

Cheers! And yeah, I guess that’s part of the notion I’m missing out on - so many of the testimonials on this thread (and elsewhere) have been really, really focused on the ‘negative’ part of the experience: ‘once you’ve played at 60, you can’t go back to 30! Once you’ve played at 90, you can’t go back to 60!’, but at

Yeah, as someone who’s primarily a console gamer, I... don’t get the appeal of displaying the frame rate on screen at all. Like, just from a gaming perspective, I tend to like games that minimize their HUD to the highest degree possible (bonus points if they can do it in a diegetic manner, like Dead Space), so I don’t

I definitely think a lot of it is that I’m primarily a console gamer (and the PC games I do play run more to the small-team indie variety, which, as you say, are rarely going to be designed as fps-taxing beasts). And the general consensus I’ve taken from this thread - and by ‘consensus’, I mean ‘the average point

Here’s the thing, though: I feel like it’s the ‘to me’ part in your post that kind of gets overlooked. Based on most of the replies I’ve received on this thread, the distinction between frame rates - especially in the higher margins - is only noticeable after you’ve been viewing those higher frame rates for some time.

Fair enough, but I’d say the difference is also that, in this metaphor, the people driving the nice cars don’t seem to understand that the beat-up Civic can still get you from point A to point B with no problems. Look at the rest of this thread; there are plenty of people going on about how anything under 60 is

Yeah, most of the responses I’ve seen so far have basically boiled down to ‘you don’t notice it until you’ve played on a higher frame rate for a while and then try to go back down’. Which is understandable, but also doesn’t really give me any incentive to want to play on high-end machines - for the same reason I don’t

Yeah, I would say the only distinction here is that some people get so up in arms over it... but chances are, somewhere else on the Internet, there’s an argument raging over whether or not cilantro actually tastes like toothpaste, as well. Arguing about subjectives as though they’re iron-clad truths is just a thing

Yeah, I very much have to kind of squint and look really, really carefully at that video before I notice it. Like, it’s definitely there, I’m not denying that, it’s just not the sort of thing where its lack would ever render a game ‘unplayable’ for me, or make me want to spend hundreds of dollars on equipment just to

Yeah, I think the notion that it’s something you kind of ‘train’ yourself to see - on purpose or otherwise - is what I’ve come around to here. I think of it a lot like wine tasting: for people who’ve never really been into wine, beyond ‘red, white, and rosé’, there’s not a lot of difference between vintages. Then,

After reading through the responses here (which have been all across the board), what I’ve come down on is more or less that it’s a ‘trained’ thing: the more you play at higher frame rates, the more you’re going to notice higher frame rates. Much like the audiophiles in your example: the more they listen to vinyl, the

Yeah, I tend to think there’s a fair bit of that at play. Like, I’m not debating whether or not it’s ‘better’ - clearly, it takes more advanced hardware to run visually complex games at higher fps. I just don’t know that it makes sense as the be-all, end-all comparison it seems to get used as. (See: the number of

Thank you for being a perfect example of someone conforming to the stereotype I mentioned in the original post. Your attitude here is ‘Exhibit A’ of needlessly aggressive support for high frame rates.