B16CXHatch

My old roommate and I spent the entire time playing Odin Sphere, jaws agape at how beautiful the game was. However, that time was kinda short. It bored us to tears after just a couple of hours. The game was longer than it should have been, and a serious grind.

Depends on the person. I'm a sucker for special edition crap. I try to get them when I can. I'm bummed that I couldn't get the Wizard's Edition of Ni no Kuni (I was desperately broke at the time). But I have a bit of a collector's mentality. I like to get the top version of every game, and when I buy used, I want it to

Phew. Glad I already got my Tales of Xillia Collector's Edition order in at GameStop.

Sakura Wars is steampunk. Errr, well, I know the one that I played is (So Long, My Love). I think they all share a single universe, so I'm guessing they all are.

I'm not sure which it was, but I had to have been brought home in either an '82 Chevette Diesel or an '86 Sunbird.

They are different. Some are better than others, too. If you like more story-driven and action-oriented RPGs, I recommend going with the Iris series (2 in particular) and trying the Mana Khemia games too (not named Atelier but still considered Atelier projects). Those will give you some familiarity while introducing the

I have a slight problem in that, yeah, I'd love for more JRPGs to make it over, but I have such a ridiculous backlog of older games, going back to the SNES and PS1, that having even more would just pile it on.

My thoughts exactly. This commenting system blows. Even the new system on Jalopnik is better, but only just.

Ha! Too right. And it's actually recent with me. Had a bitch of a time getting Super Mario Bros. 3 to play the other day. Maybe I should hunt down a top loader or get one of the new multi-consoles (NES, SNES, and Genesis all-in-one).

The price or performance had nothing to do with it. I was saying that the original graphic posted before the article was updated with the full specs was so vague that it basically just described my PC and about 5 million others.

The problem is that no one is reading the other comments. The joke made more sense before the article was updated. Now that it has, the joke means nothing.

That's definitely true, but when it comes down to raw power per core, AMD loses on the whole. It breaks my heart, honestly. I was such a big AMD fan for years, but I had to commit the blasphemous act of building an Intel-based rig for my latest system, because even with fewer cores, they were still outperforming big time

That's actually a pretty good point. All I was saying, though, was that the original graphic was so sparse on info that it basically just described a PC I already have. Now that there are some actual specs, it's a little more impressive, but as another commenter pointed out, even a current-generation mid-range graphics card is

It's all good man.

For the umpteenth time: I'm not some PC master race asshole trolling consoles. I was making a joke about the lack of info the original graphic was presenting. I actually love consoles and, for the time being, use them more than my PC to game. The PC is almost exclusively an FPS and racing machine, while the consoles handle

Says the dude who doesn't read. I was making a joking comment about the lack of information the original graphic was presenting. I was saying that it basically just described my computer. Now that some real specs have been posted, I made a more serious comment.

Yes, I know this. I was stating that the PS4 has 8GB of unified RAM shared throughout the system, and I was making a smartass comment with the HD 3000. It isn't even enabled on my computer. Everything from 2D desktop mode to 3D gaming mode is handled entirely by my GTX 560 Ti.

OK, for anyone who still thinks I'm serious: now that some real specs are listed instead of that kinda goofy graphic (which is what I was poking fun at), it's actually kinda impressive. For now, at least.

The Intel HD 3000 integrated into my CPU. Unified memory, buddy. That means everything shares the same memory. My GTX 560 Ti has 1GB of its own dedicated GDDR5 memory.