My Joel didn’t kill a single doctor. Or soldier, besides the one in the cutscene who is clearly going to kill you.
Right? He was also obviously about to be executed by the Fireflies for his trouble. At a minimum you’d expect him to defend himself. They kinda screwed up too.
I exceeded my internet snarkiness limit on my comment, so perhaps it is I who am the judgy one! You got a few people bristling at your comment though!
Precisely how many seconds do they have to look at the scenery to get your stamp of approval? Is it a gradient, where as they approach that number you become less “bothered?”
“It looked cool and the designers didn’t care about realism” is the real answer to your comment.
Me too. I wasn’t even on the fence until yesterday, so that’s good. I feel like, though I don’t truly remember, XII and XIII had similar reviews that had lots to say about flaws but downplayed them. In fact I want to say those games had even better reviews at first. But maybe XV gets a more sober assessment…
Yah, I’m a new ps4 owner, and what’s with that? I have a wireless connection on my pc and can download at insane speeds. I have a wired connection to the ps4 and it’s maybe 10x slower.
I thought that was already a thing in mobile apps? I never buy them so I dunno.
I wasn’t being critical, I thought it was a legitimately interesting and amusing brain mix-up. My friend sometimes says Lock and Peele instead of Key and Peele, for example.
You’re making a last of us joke, right? The character is Joel. Did your brain go to Ethan because of the Coen brothers? That’s kinda funny.
While true, a GTX 1080 and i7-whichever is pretty much the 2016 standard-issue good gaming PC. Even if they half-assed their QA, Kirk’s setup is one they tested. He doesn’t say in the post but obviously he isn’t trying to do this with 4 gigs of RAM or a hard drive he built himself in an EE class. I think part of the…
I believe he’s referring to console versions. Witcher 3 is 30fps on both XBone and PS4, and MGSV is 60.
Maybe take all those impassioned arguments over to Steam and turn it into a user review, since right now it’s sitting at mixed reviews. That’s the main reason I’m not interested in it. Gamespot gave it a good review, but even they called it cryptic occult nonsense.
The DC metro’s automation has been off since the Ft Totten crash years ago so they won’t be smooth or consistent until that is fixed a few years after hell freezes over. But seriously, I think I read they were just starting to have it on during select times in certain places a year or so ago, but maybe they changed…
An i5-6600 being better value for gaming than an i7-4790 isn’t just my controversial unique opinion, I think that’s a pretty common conclusion. It’s a 100 dollar difference so hey, why not if you’re already dropping money, that’s fine if that’s your approach. I’m just saying you can take those 100 bucks and roll that…
The good i5s are quad-core (maybe they’re all quad?). My 2011 i5 is quad core. The i7’s main claims are that it enables hyper-threading, so you can act like you have 8 logical cores (and back to your point, that still isn’t widely supported in games), and a bigger cache. The cache will help...if you’re not waiting for…
If you can afford the i7, I’d still say get the i5 and roll the savings into more video card. That’ll better maximize your gaming capability for a given budget.
My practical piece of advice is that gaming PCs don’t need more than an i5 (my i5-2500 from 2011, paired with a new video card, comfortably runs Witcher 3 on high settings). $1300-1400 is a huge budget unless you’re also buying a new monitor, mouse, keyboard, etc., or if you’re going for 1440p gaming.
The quintessential signs pun.
The Witcher 3 was a great Gwent simulator. Did y’all play the sidequest where you and Ciri defeat the Wild Hunt? Pretty good too!