I can't rub my temples hard enough to continue this conversation. Sorry, I'm dropping it.

It would be interesting, though the downside of the engine is that you can't really have any motion or shaders in there. No reflective or refractive surfaces, no transparency, no specular highlights... it would be a great way to show off a really well-sculpted, well-painted dragon or something like that, as the

It's actually the other way around. Scanning can be immensely useful to developers, and it's in active use in a lot of AAA studios today. They have to do a LOT of processing and art work to turn scans into usable assets for modern game engines, but it gives accurate results and that's what they need.

The trouble with that is, the lighting is baked in.
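
To illustrate what I mean, here's a quick sketch (made-up numbers, nothing from their engine): a specular highlight depends on where the camera is, via the half vector in something like Blinn-Phong, so a color that was captured once from one viewpoint can't react when you move.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def blinn_phong_specular(normal, light_dir, view_dir, shininess=64.0):
    # The specular term depends on the *view* direction through the half
    # vector, which is exactly the information a baked, scan-captured color
    # no longer has.
    h = normalize(normalize(light_dir) + normalize(view_dir))
    return max(float(np.dot(normalize(normal), h)), 0.0) ** shininess

n = np.array([0.0, 1.0, 0.0])    # surface normal (example values)
l = np.array([0.3, 1.0, 0.2])    # direction toward the light (example values)

# Same surface point, same light; only the camera moves:
print(blinn_phong_specular(n, l, np.array([-0.3, 1.0, -0.2])))  # at the mirror angle: full highlight (1.0)
print(blinn_phong_specular(n, l, np.array([-1.0, 0.2, 0.0])))   # off to the side: highlight all but gone
```

Move the camera in a real renderer and that value changes every frame; move it around a baked scan and it never does.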

Another reply to my first post, woo.

There's a lot of basis to it. I'm also quite positive about the outlook for this program.

There are a few things about Euclideon you need to know:

It is fully rendered 3D; it's just that the illusion is broken because the specular lighting and dynamic shading... and movement... you're used to are just not there. If you laser-scanned a gold ring, for instance, you'd see every tiny detail of it in picture-perfect quality. But... the way the light plays off the

Oh, this isn't fake by any means. It's actually quite impressive. It's just that the very nature of what they're trying to capture is giving their software fits here. It's never going to work very well, and the result doesn't appear to stand up to scrutiny in this case.

I'm too tired to keep making the same reply at this point. Sorry I won't be getting back to you, but if you don't mind doing some digging, you'll get some much-needed context on this discussion.

You sound upset. Are you feeling alright?

Oh, I'm definitely all for this tech. I was 100% in support of it back when they first started putting out information about Unlimited Detail all those years ago. I honestly think this is the next step in the evolution of formats for recording our world. We had pictures and video, and now there's this. You can snapshot a

I've had some experience with an approximate version of this. Actually, it's the same technique they used with the Fox engine: just taking a ton of pictures of someone from different angles and using software (like Autodesk 123D Catch) to parse the image data into a 3D model. It's great for getting a fairly
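
For anyone curious what that kind of software is roughly doing under the hood, here's a bare-bones two-view sketch using OpenCV. The file names and the camera matrix K are placeholders I made up, and real tools like 123D Catch run this across dozens of photos with dense reconstruction and bundle adjustment on top; this just shows the core idea of matching features and triangulating them into points.

```python
import cv2
import numpy as np

# Two photos of the same object from slightly different angles (placeholder names).
img1 = cv2.imread("shot_left.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("shot_right.jpg", cv2.IMREAD_GRAYSCALE)

# Rough camera intrinsics; a real pipeline calibrates these or reads EXIF data.
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])

# 1. Find and match distinctive features between the two photos.
orb = cv2.ORB_create(5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Estimate how the camera moved between the two shots.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate the matched pixels into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T

print(cloud.shape)  # N x 3 points; the commercial tools densify and mesh this
```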

I was, and still am, quite excited about this technology. I just don't see it as having practical application in modern games beyond capturing realistic and quick data which can be manipulated and utilized to create assets. This certainly isn't the easy-button they're still pushing it as.

It doesn't look any better in the video. Not this scene, anyway.

The problem with that is, this technology isn't made to do that. See, in order to get the effect you describe, you'd need potentially infinite versions of every texture to simulate how they would look from every possible viewpoint. This tech is only applying textures to models, so there's no chance to apply

There are problems with this system, though.

Flora is really tough to do in general, because there's a lot of detail in every leaf and blade of grass, and you really can't easily capture that detail to any decent degree of accuracy because you have millions of each to render at the same time. The sheer number of polygons you'd need is ridiculous.
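
Just to put rough numbers on it (every figure below is a made-up ballpark, not anything measured):

```python
# Back-of-the-envelope triangle count for one modest grass field.
blades_per_m2 = 2_000         # fairly dense grass (illustrative guess)
field_area_m2 = 100 * 100     # a single 100 m x 100 m field
tris_per_blade = 12           # a very low-poly blade, just a few bent quads

total_tris = blades_per_m2 * field_area_m2 * tris_per_blade
print(f"{total_tris:,} triangles")  # 240,000,000 triangles, for the grass alone
```

Which is exactly why games cheat with billboards, instancing, and LOD instead of giving every blade honest geometry.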

Looks about like I expected. Since they're using real actors for some games these days, I suppose they do need to use actual captured data like that. Yeah, I'm pretty sure what they do is take that model and use it as a base to create one with proper topology, then projection map the texture from the captured one
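
A crude way to picture that transfer step: for every vertex on the clean, retopologized mesh, steal the color of the nearest point in the scan. Real tools ray-project into a texture map rather than working per vertex, and all the arrays below are random stand-ins, but the idea is the same.

```python
import numpy as np
from scipy.spatial import cKDTree

# Stand-in data: a colored scan (positions + RGB) and the vertices of the
# clean mesh an artist rebuilt over it with proper topology.
scan_points = np.random.rand(200_000, 3)   # placeholder scan positions
scan_colors = np.random.rand(200_000, 3)   # placeholder per-point RGB
retopo_verts = np.random.rand(5_000, 3)    # placeholder clean-mesh vertices

# Nearest-neighbor lookup: the detail lives in the capture,
# the topology lives in the remesh.
tree = cKDTree(scan_points)
_, nearest = tree.query(retopo_verts)
retopo_colors = scan_colors[nearest]

print(retopo_colors.shape)  # (5000, 3): one transferred color per clean vertex
```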

It sounds like the worst you get is not being able to play for a while, and that the rules mainly exist to keep the "for fun" online play actually fun. They aren't really going into details on the rules, which I take to mean they don't intend to enforce them heavily. They want players to know generally how