The discussion has moved over to Discord

Now I've had enough



This topic contains 60 replies, has 12 voices, and was last updated by Kryyss Kryyss 11 Jul 2018 @ 5:08am.

Viewing 15 posts - 1 through 15 (of 61 total)
  • #17665
    Moon
    Kickstarter Backer
    Topics: 6
    Replies: 10

I need a new computer; mine is too old and slow. I’m quite blown away: a buddy’s 4K monitor impressed the hell out of me, so I’m thinking about going all in. But what hardware specs do I need for the full Overload experience in 4K?

    Graphics card with Shader 3.0 support (1GB+ VRAM recommended)
    2GHz Dual core processor or higher (i3 or higher recommended)
    4GB of memory
    10GB of HD space (rough estimate, could be higher)

What does that translate to in terms of CPU, GPU, and RAM?

    #17666

    bwabbit
    Participant
    Topics: 2
    Replies: 18

I doubt you’d gain anything from having more than 8GB of RAM. For CPU, a high-end i5 is probably fine. The GPU is the most critical component, and it’s a case of the-faster-the-better if you want a solid 60+ fps at 4K with everything, including screen-space reflections, turned on. At least a GTX 1070, I figure.

    #17667

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

If you have just a few programs open at the same time, 8GB of RAM might be enough, but 4x8GB doesn’t cost that much and lets you keep many programs in RAM to quickly switch between apps. Terabyte drives are really cheap. A GTX 1080 or 1070 Ti both run 4K well. I’m not that familiar with current CPUs, but an i7-6700 from two years ago seems to run just fine.

    Same Moon from http://moon.descentforum.net/Descendarium/ ?

    #17668
    Moon
    Kickstarter Backer
    Topics: 6
    Replies: 10

    Yep.

I was thinking about 2x8 GB of RAM, and another 16 GB when it gets cheaper. The GTX 1080 is too expensive, so I’m going to aim for the 1070 Ti.

    #17669
    TwoCables
    Kickstarter Backer
    Topics: 118
    Replies: 1474

I have 8GB of memory and it’s always much more than enough. Unity needs a page file though, so make sure it’s set properly.

    My i5-2500K at 4.5 GHz and my slightly-overclocked GTX 780 run Overload extremely well at 1080p with all of the in-game options set as high as they go, but I do have Screen-Space Reflections off. It’s not worth the big performance hit.

    I also have G-SYNC on a 144 Hz monitor, and my usual framerates are 100-120-something with nice low frametimes of around 10ms.

    So yeah, this game isn’t demanding at all.

    Prepare for Overload…

    #17670

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

Can’t really go wrong either way; the GTX 1080 has a slight performance lead with a slightly higher price tag:

    $480 average for GTX 1070 Ti
    $560 average for GTX 1080

    2x8GB with the option of adding another 2x8GB is a good idea, the 16GB sticks are still pricey.

1080p is around 2.1 megapixels
2160p is around 8.3 megapixels

2.1 MP x 120 Hz = 252 MP/s
8.3 MP x 60 Hz = 498 MP/s

So 4K @ 60 Hz is twice as demanding as 1080p @ 120 Hz in terms of fill rate.

For VR, the Vive Pro is 1440 x 1600 per eye, or about 2.3 MP per eye. 2.3 MP x 90 Hz = 207 MP/s per eye, over 400 MP/s counting both eyes. It also has to render two views per frame rather than just one (though both of these GPUs are optimized to handle multiple views at the same time).
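The fill-rate arithmetic above can be double-checked with a short script (exact panel resolutions assumed: 1920x1080 for 1080p, 3840x2160 for 4K, and 1440x1600 per eye for the Vive Pro):

```python
# Fill-rate comparison: megapixels pushed per second for each setup.

def megapixels(width, height):
    """Resolution in megapixels."""
    return width * height / 1e6

# 1080p @ 120 Hz
rate_1080p = megapixels(1920, 1080) * 120   # ~249 MP/s

# 4K (2160p) @ 60 Hz
rate_4k = megapixels(3840, 2160) * 60       # ~498 MP/s

# Vive Pro: 1440 x 1600 per eye @ 90 Hz, two eyes
rate_vr = megapixels(1440, 1600) * 90 * 2   # ~415 MP/s total

print(f"1080p @ 120 Hz: {rate_1080p:.0f} MP/s")
print(f"4K    @  60 Hz: {rate_4k:.0f} MP/s")
print(f"Vive Pro total: {rate_vr:.0f} MP/s")
print(f"4K/1080p fill-rate ratio: {rate_4k / rate_1080p:.2f}")
```

The 4K-to-1080p ratio works out to exactly 2.0, since the pixel count quadruples while the refresh rate halves.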

A G-Sync monitor is nice to have, though many seem to come with only the AMD equivalent (FreeSync). Some newer ones even have 4K with HDR.

At least as important as the average framerate are the worst-case frametime and the steadiness of frametimes; I’m not sure if benchmarks include those stats.

    EDIT: corrected res of Vive Pro

    #17677
    ziqidel
    Participant
    Topics: 3
    Replies: 9

I have a top-end new computer, but I still had to set the textures and resolution low to stop the frame rate from dropping and badly hurting my ability to play, especially in intense situations like the Hive in Challenge Mode. It was still good, though. I like that they are still at the forefront of graphics and AI, as the Descent series always was.

    #17679
    Pumo
    Participant
    Topics: 4
    Replies: 27

    Just out of curiosity ziqidel, what are your specs?

I found the game runs pretty well on my second-gen i3 with my GeForce 1050 Ti at 1920×1080 resolution. There are some frame drops every now and then when ambient occlusion is enabled, and the only setting I had to disable entirely was screen-space reflections, but overall I can play at stable framerates with high texture resolution.

    Also, what are your graphic settings on the game?

    http://pumosoft.3d-get.de – Pumo Software official Website
    – Pumo Mines progress: 60%

    #17684
    Haunted Parrasp
    Kickstarter Backer
    Topics: 31
    Replies: 428

I’m running the AMD equivalent of an i5 (possibly slightly better) and a GTX 1060, and I can run the game at 1920×1080 with all settings at max and nary a stutter.
    You shouldn’t need too much more than that to run it at 4K.

    Ship’s cat, MPSV Iberia
    Check out his original music @ http://vertigofox.bandcamp.com/

    #17712
    Moon
    Kickstarter Backer
    Topics: 6
    Replies: 10

Really? Given that the number of pixels quadruples, I find that a bit hard to believe.

    #17733
    ziqidel
    Participant
    Topics: 3
    Replies: 9

Just out of curiosity ziqidel, what are your specs?

    I found the game runs pretty well on my second-gen i3 with my GeForce 1050 Ti at 1920×1080 resolution. There are some frame drops every now and then when ambient occlusion is enabled, and the only setting I had to disable entirely was screen-space reflections, but overall I can play at stable framerates with high texture resolution.

    Also, what are your graphic settings on the game?

    Intel Core i7-8550U, 1.8GHz, Quad core, 8GB RAM (as far as I know that’s in total, not each).
    Intel UHD Graphics 620 I think is the graphics card…

I also have a 4K external monitor which mostly works fine. Sometimes when watching certain movies it disconnects itself for a few seconds at random times, but that has never happened in Overload.

At first I could play with all the graphics settings on and the second-lowest resolution, but it was still a bit jittery at times. I fiddled with some other settings, but they seemed to have no effect (mirrors, shadows, anti-aliasing). The ones with the biggest effect seemed to be texture quality and screen resolution, so I keep those set to lowest when I’d rather have smoother play.

I expected much better performance from these specs, so maybe some new Windows 10 or Steam feature is hogging resources, but I can’t see anything sticking out in Task Manager. Or maybe the graphics chip just isn’t that good. I think performance is a bit better in offline mode, too.

    #17745

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    Intel Core i7-8550U, 1.8GHz, Quad core, 8GB RAM (as far as I know that’s in total, not each).
    Intel UHD Graphics 620 I think is the graphics card…

That’s integrated graphics (I assume your PC is a laptop), and it’s always a lot slower than a dedicated graphics card. Some high-end gaming laptops do have dedicated cards, I believe, but they’re more common in desktop PCs. That CPU is fine: 1.8 GHz is just the base clock, and it automatically boosts up to as much as 4 GHz when needed.

    #17847
    Moon
    Kickstarter Backer
    Topics: 6
    Replies: 10

Also, what about those adaptive-sync monitors? Are they necessary these days for judder-free gameplay?

    #17848

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    Found a comparison article https://www.digitaltrends.com/computing/nvidia-g-sync-or-amd-freesync-pick-a-side-and-stick-with-it/

    Some quotes

    “While Nvidia’s G-Sync is enabled by including a chip in the construction of the monitor, FreeSync uses the video card’s functionality to manage the refresh rate of the monitor using the Adaptive Sync standard built into the DisplayPort standard.”

    “While G-Sync is proprietary Nvidia technology, and requires the company’s permission and cooperation to use, FreeSync is free to use, and implementing it is a goal of the program, not a way to make money.”

    “You have to choose whether you want to go with Nvidia or AMD, and then purchase a monitor and GPU accordingly.”

It would be nice if Nvidia cards could make use of FreeSync as well.

    #17849

    bwabbit
    Participant
    Topics: 2
    Replies: 18

Adaptive sync (G-Sync / FreeSync) isn’t really any more necessary than it would’ve been a decade ago. It depends on how bothered you are by frame-rate jitter and the slight extra latency of plain old V-Sync/triple buffering. I think my monitor only runs at 60 Hz, so when frames get ‘dropped’ it’s quite noticeable, but it still doesn’t bother me much. I’d personally be more concerned with getting a fast GPU, and second to that a monitor with a high refresh rate, especially at the moment with Nvidia trying the whole ‘vendor lock-in’ thing with G-Sync, even though it doesn’t look like they’re going to abandon it any time soon.


Forums are currently locked.

