

[OMG I am so bitter the Paizo forums ate this thread yesterday]
I just built my new gaming rig in anticipation of PFO, and as I was doing some benchmarking/tweaking it occurred to me: "Hey, we don't have PFO right now, but we do have Unity games that might be comparable!" I've been playing a pretty fun sandbox zombie survival game lately: State of Decay. It's a third-person shooter/survival game that is basically a more realistic version of The Walking Dead: you desperately try to find shelter and other survivors, scavenge for food, etc., but unlike TWD, when you do something stupid you die, and death is permanent.
Just because this is a Unity game doesn't automatically make it comparable to PFO, since (as I understand it) GW will be making its own tradeoff decisions about how many polygons, materials, bones, etc. to use. But since this game involves combat between human figures, landscapes, interiors, and a single large geographic area (no loadable zones), I think it's a useful data point.
On my old Core2Duo rig, even at mid-level graphics settings the game stuttered a lot during fights or when I was driving around fast and bringing in new views to be rendered. How would it play on my luscious new system?
Specs
MSI Z87-G45 Gaming LGA 1150 Motherboard
MSI N660-2GD5/OC GeForce GTX 660 2GB 192-bit GDDR5 Graphics Card
Intel Core i5-4430 Haswell CPU @ 3.0GHz
G.SKILL Ripjaws 4GB DDR3 1600 Memory
Windows 7 Ultimate SP1
For the test, I set State of Decay to full HD resolution (1920x1080) with settings on "Ultra," and left VSync* on. Basically the game played super smooth and looked nice when I was sighting in on the landscape, and stayed pegged at 60 fps even when I was running over zombies. Upgrading from a five-year-old system was a huge improvement in performance and gameplay.
Just for fun I turned VSync off, and depending on what I was doing the frame rate ranged from 60-80 fps, with no noticeable tearing. I also played around with BioShock Infinite at Ultra settings (ranging from 30 fps to 90 fps depending on what was happening on screen). However, BioShock uses Unreal Engine 3, so I don't think it's as comparable.
Anyone else have a Unity game that might be useful to test against?
*VSync caps your graphics card's fps at the refresh rate of your monitor, so you don't get "tearing," where different parts of the screen show pieces of frames rendered at different moments. Basically, if you have a 60Hz monitor, your frame rate will be capped at 60 fps.
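(For the Unity-curious: the cap itself is just a quality setting. Here's a minimal sketch of flipping it at runtime--the VSyncToggle class name is mine, but QualitySettings.vSyncCount is Unity's actual setting:)

```csharp
using UnityEngine;

// Minimal sketch of toggling VSync at runtime in a Unity game.
// QualitySettings.vSyncCount is Unity's real setting:
// 0 = off (uncapped), 1 = sync to every vertical blank (60 fps on a 60Hz monitor).
public class VSyncToggle : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.V))
        {
            QualitySettings.vSyncCount = (QualitySettings.vSyncCount == 0) ? 1 : 0;
            Debug.Log("VSync " + (QualitySettings.vSyncCount == 1 ? "on" : "off"));
        }
    }
}
```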


I can't seem to recall any major released 3D MMO that uses Unity at the moment.
I'm also in the SotA alpha, and even unoptimized the game seems to run pretty well on my older system.
Might and Magic X Legacy uses Unity, and is a very pretty game. I'm not sure if it would be a great "test", but I'm very impressed with it myself.


Being wrote: Huh. I had a post in here and it is gone.
The entire thread got lost in cyberspace; this is a new thread about systems.
@Mbando, so how much did you save by building it yourself, and what did the parts cost you?
Parts:
MSI N660-2GD5/OC GeForce GTX 660 2GB 192-bit GDDR5 $189.99
Intel Core i5-4430 Haswell 3.0GHz LGA 1150 84W Quad-Core $189.99
MSI Z87-G45 Gaming LGA 1150 Intel Z87 HDMI SATA 6G $159.99
G.SKILL Ripjaws Series 4GB (2 x 2GB) 240-Pin DDR3 $47.99
Antec Three Hundred Illusion Black Steel ATX Mid Tower $64.99
Total: $652.95
I also cannibalized a Corsair TX 750W PSU ($109 new) I had, reused my old HDs, and grabbed an optical drive off my shelf.
Really, super happy with the case--it's well designed, looks cool, runs cool, etc.
I just priced out an identical system at CyberPower PC for $931, so counting the parts I reused I only saved about $100, but I really enjoyed building it :)


Stephen Cheney wrote: I believe State of Decay is actually CryEngine, not Unity. But it's still an awesome game made by great folks local to us, and well worth checking out :)
Curses--I'm an idiot. How did I F-- that up? Now I have to find a good Unity game to test.
Wasteland 2 beta is on Steam. But you have to spend coin unless you backed it.


Ok. Once more, with feeling:
So I just backed Gloria Victis, a low-fantasy medieval MMO. It's in pre-Alpha (no NDA, though), and not very complete. They do have a lot of art assets in, though--people, armor, weapons, landscapes, interiors, and lots of trees and grass. Like, a lot of vegetation. No effects, though--no spells or whatever.
I set the game to 1920x1080 and pushed all the sliders to their highest settings: sun shafts, bloom, grass/tree render distance and density, soft shadows, and so on.
I ran around, and was repeatedly attacked and often killed by uniformly hostile animals--once a boar, but no less than four foxes detected me and launched remorseless suicide attacks that were 50% kill effective.
FPS dipped into the 50s at a few moments when I turned a corner, but mostly stayed in the 60s and 70s. For your viewing pleasure of a FOR SURE THIS TIME Unity MMO on a mid-range system, you can see:
sunlight,
dark interiors,
dark exteriors, and
dark glades lit with burning q-tips.

Quandary

Or better, auto-scale/reduce graphics settings based on actual FPS drops, or simply on the number of characters in your local area when PVP conditions are met (i.e. non-allied characters are present). This sort of thing would itself have some processing overhead, so an option to turn it off on the lowest-end machines would probably be preferred, so they can at least get a playable experience.
The character-count trigger should not fire for Stealthed opponents (since doing so would give away their presence via the change in graphics). I guess intra-party PVP would trigger the graphics change once a character who targeted an ally is attacked back in retribution? (since sometimes an AoE catches an ally by accident and shouldn't trigger real PVP)
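Something like this is easy to sketch in Unity. To be clear, this is just my illustration of the idea--made-up thresholds, not anything GW has described. It smooths the frame rate and steps Unity's built-in quality level down or back up when it leaves a target band:

```csharp
using UnityEngine;

// Sketch of FPS-driven auto-scaling using Unity's built-in quality levels.
// The thresholds and smoothing factor are made-up numbers for illustration.
public class AutoQuality : MonoBehaviour
{
    public bool scalingEnabled = true; // option to turn it off on low-end machines
    public float lowFps = 25f;         // step quality down below this
    public float highFps = 55f;        // step quality back up above this
    public float checkInterval = 3f;   // seconds between adjustments

    float smoothedDt = 1f / 60f;
    float timer;

    void Update()
    {
        // Exponential moving average of frame time, so a single hitch
        // doesn't trigger a quality change.
        smoothedDt = Mathf.Lerp(smoothedDt, Time.unscaledDeltaTime, 0.05f);
        timer += Time.unscaledDeltaTime;
        if (!scalingEnabled || timer < checkInterval) return;
        timer = 0f;

        float fps = 1f / Mathf.Max(smoothedDt, 0.0001f);
        int level = QualitySettings.GetQualityLevel();

        if (fps < lowFps && level > 0)
            QualitySettings.SetQualityLevel(level - 1, applyExpensiveChanges: false);
        else if (fps > highFps && level < QualitySettings.names.Length - 1)
            QualitySettings.SetQualityLevel(level + 1, applyExpensiveChanges: false);
    }
}
```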


Dynamic changes in graphics seem low-yield. If your system starts thrashing because of all of the complex models in memory, then swapping them out for simpler models right when the fight starts is probably going to take a second or so.
If the trigger event is entering battle, then I suspect you're right. However, if the trigger event is reaching a threshold of displayed character models, then it seems much more feasible. Granted, this side of things is way outside my area of expertise.
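To make that concrete, here's a hypothetical Unity version of the character-count trigger (CharacterTag is a marker component I invented, and the thresholds are made up). The gap between the two thresholds keeps the quality level from flapping when the count hovers near the limit:

```csharp
using UnityEngine;

// Sketch of a character-count trigger: switch to a cheaper quality level when
// lots of character models are present, and restore it when the crowd thins.
public class CrowdQuality : MonoBehaviour
{
    public int dropAt = 30;     // at this many characters, reduce quality
    public int restoreAt = 20;  // at this few, restore it
    public int normalLevel = 4; // indices into QualitySettings.names
    public int crowdLevel = 2;

    bool reduced;

    void Update()
    {
        // FindObjectsOfType is expensive; a real implementation would keep a
        // running count updated from OnEnable/OnDisable on the characters.
        int count = FindObjectsOfType<CharacterTag>().Length;

        if (!reduced && count >= dropAt)
        {
            QualitySettings.SetQualityLevel(crowdLevel, applyExpensiveChanges: false);
            reduced = true;
        }
        else if (reduced && count <= restoreAt)
        {
            QualitySettings.SetQualityLevel(normalLevel, applyExpensiveChanges: false);
            reduced = false;
        }
    }
}

// The hypothetical marker: an empty component you'd attach to character prefabs.
public class CharacterTag : MonoBehaviour { }
```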


What features of a graphics card are most important these days?
I've been looking around and found this side-by-side comparison chart. Texture fill rate seems like it would be a big influence on performance, but I haven't paid close attention to the inner guts of computers since the days of getting the master and slave on the right parts of the EIDE cable. Also:
If there's 2GB of VRAM on the video card, how does system RAM factor in, if at all? Is there anything meaningful between 8GB and 16GB for video performance, or for keeping track of many characters in the area?
Last I heard there was a notable boost in gaming performance from i3 to i5, and a small optional boost, as far as games are concerned, from i7. Is that still the case with the newest generation?


- GPU model, core clock (and boost clock), memory size, memory clock, and memory bandwidth all matter. A card that sneaks in extra memory but has a lower memory bus speed isn't necessarily better. The best way to pick a video card is to compare benchmarked performance in real games, because it's almost impossible to eyeball the difference between two cards when multiple variables are in play: card A has more memory and a higher GPU core clock, but card B has a higher memory clock and more bandwidth--which is faster? Answer: no idea (quick arithmetic at the bottom of this post). This performance chart is very useful and very up to date. Tom's Hardware has one also, but they don't have 2014 numbers up yet.
- System RAM and VRAM don't directly interact--having excess VRAM won't make up for low system RAM, and vice versa. Since almost all current games are 32-bit, a game process can only address 2GB of memory (up to 4GB if it's large-address-aware), so 4GB of system RAM or more works fine for gaming: the game gets its 2GB, and the OS and other programs can sit in the rest.
- Yes, the sweet spot is currently the i5. The most important performance factor in gaming is your video card, but the CPU does have an impact, and an i3 can be a bottleneck. For example, in a game like Skyrim an i3 is a serious bottleneck, but an i7 doesn't improve performance enough to justify the price difference.
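To put rough numbers on the bandwidth point above: bandwidth is approximately bus width (in bytes) x effective memory clock. The GTX 660 in my build has a 192-bit (24-byte) bus and 6 GT/s effective GDDR5, so 24 x 6 ≈ 144 GB/s. A hypothetical card with twice the VRAM but a 128-bit bus at the same clock would move only 16 x 6 ≈ 96 GB/s--more memory, a third less bandwidth. That's why benchmarks beat spec sheets.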