Now I've had enough

This topic contains 60 replies, has 12 voices, and was last updated by Kryyss on 11 Jul 2018 @ 5:08am.

Viewing 15 posts - 31 through 45 (of 61 total)
  • #17900

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    When they are syncable, but it’s important to be able to gracefully handle cases where they are not, such as when the GPU can crank out more than the display can draw.

    Games should also decouple input/physics frames from rendering frames. For older games where you can pump out a frame every millisecond, it would be rather foolish to render 1000 frames per second when the display is only capable of 60 or 120.

    However, there would be nothing wrong with running input/physics at 1000 Hz and only rendering, say, 1 out of every 10 of those frames. Your mouse being sampled at 1000 Hz wouldn’t be wasted: although you’d only see some of the frames when traveling around a smooth curve, you would in fact be traveling around a smooth curve rather than along some rough approximation.
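    As a toy sketch of that idea (all numbers are illustrative, not from any real engine), the decoupling might look like this:

```python
PHYSICS_HZ = 1000          # input sampling + physics rate
RENDER_DIVISOR = 10        # draw only 1 out of every 10 physics frames
DT = 1.0 / PHYSICS_HZ

def run(ticks):
    """Step a 1-D body at constant velocity, rendering every 10th tick."""
    pos, vel = 0.0, 5.0    # units and velocity are arbitrary
    rendered = 0
    for tick in range(ticks):
        pos += vel * DT    # physics (and input sampling) at 1000 Hz
        if tick % RENDER_DIVISOR == 0:
            rendered += 1  # stand-in for an actual draw call
    return pos, rendered

pos, rendered = run(1000)  # one simulated second: 1000 physics steps, 100 draws
```

    The point being that `pos` follows the full 1000 Hz trajectory even though only a tenth of the states are ever drawn.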

    #17902
    Yoshimitsu
    Kickstarter Backer
    Topics: 57
    Replies: 421

    Seems to me like it depends on the power of your graphics processing. If your graphics card can’t maintain a framerate equal to your monitor’s refresh rate then free-sync/G-sync will help you. In the opposite case where the graphics card can far exceed the refresh rate of the monitor then v-sync or some other form of FPS limiting is better.

    #17904

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    Fast Sync appears to be what I described earlier (though I think they could explain it better). He states that triple buffering simply fills up the buffers, then basically stalls until they are empty.
    In the NVidia settings I just noticed you can enable Fast Sync or Adaptive Sync globally or per program. With Fast Sync enabled, I’m not sure how G-Sync or FreeSync helps when enabled concurrently.

    Okay, so with triple-buffering it’s partly a matter of terminology ( https://en.wikipedia.org/wiki/Swap_Chain ). I’m not sure, but I suspect some OpenGL games (and possibly some software-rendered games) do ‘true’ triple-buffering as opposed to the Direct3D flavour. It’s a tradeoff between lower latency and lower CPU/GPU usage, and Microsoft opted for the latter with Direct3D.
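    Here’s a toy, deterministic simulation of that tradeoff (the timings are made up — a 10 ms render time against a 16 ms refresh — and neither mode is meant to match any specific driver). "fifo" models the queued Direct3D-style swap chain; "discard" models ‘true’ triple-buffering where the display always gets the newest completed frame:

```python
def simulate(mode, frame_ms=10, refresh_ms=16, duration_ms=2000, queue_cap=2):
    """Return average latency (ms) between frame completion and display."""
    queue = []                 # completion times of finished back buffers
    latencies = []
    next_refresh = refresh_ms
    render_done = frame_ms     # first frame finishes at t = frame_ms
    rendering = True
    for t in range(duration_ms + 1):   # 1 ms time steps
        if rendering and t == render_done:
            queue.append(t)            # frame finished
            rendering = False
        if not rendering:
            if mode == "discard":
                queue[:] = queue[-1:]  # drop stale frames, keep the newest
                render_done = t + frame_ms
                rendering = True
            elif len(queue) < queue_cap:
                render_done = t + frame_ms   # FIFO: renderer stalls when full
                rendering = True
        if t == next_refresh:
            if queue:
                latencies.append(t - queue.pop(0))
            next_refresh += refresh_ms
    return sum(latencies) / len(latencies)

fifo_latency = simulate("fifo")
discard_latency = simulate("discard")   # lower: stale frames get dropped
```

    With these made-up numbers the queued mode settles around 20 ms of display latency while the discard mode averages a few ms — which is the latency-versus-wasted-work tradeoff in a nutshell.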

    The situation with NVidia’s fast-sync is a little complex but is explained here: ( https://gamedev.stackexchange.com/questions/58481/does-directx-implement-triple-buffering ). I can’t think of any advantage of combining fast-sync and g-sync. I’d guess most games wouldn’t use triple-buffering at all with g-sync/freesync.

    As for the adaptive-vsync option in NVidia’s settings, that’s a whole different thing, heh (it just turns vsync off at low frame-rates basically).

    When they are syncable, but it’s important to be able to gracefully handle cases where they are not, such as when the GPU can crank out more than the display can draw.

    Yep, but this isn’t usually a problem. Generally the worst-case is needlessly wasted CPU/GPU cycles due to discarded frames and a well-written graphics engine should detect that case and slow rendering down to save power and reduce resource-usage. It can be an issue with older games with framerate-dependent-physics though.

    Games should also decouple input/physics frames from rendering frames. For older games where you can pump out a frame every millisecond, it would be rather foolish to render 1000 frames per second when the display is only capable of 60 or 120.

    However, there would be nothing wrong with running input/physics at 1000 Hz and only rendering, say, 1 out of every 10 of those frames. Your mouse being sampled at 1000 Hz wouldn’t be wasted: although you’d only see some of the frames when traveling around a smooth curve, you would in fact be traveling around a smooth curve rather than along some rough approximation.

    Yup, you know your game-design stuff pretty well! Think that’s the norm these days – Doom3 was the first game I’m aware of that had a fixed physics-update-rate and I know Overload does the same thing.

    Seems to me like it depends on the power of your graphics processing. If your graphics card can’t maintain a framerate equal to your monitor’s refresh rate then free-sync/G-sync will help you. In the opposite case where the graphics card can far exceed the refresh rate of the monitor then v-sync or some other form of FPS limiting is better.

    Adaptive sync (FreeSync or G-Sync) behaves identically to V-Sync when the framerate is equal to or greater than the monitor’s refresh rate. In that case a delay is introduced, and it’s all handled silently by the graphics driver.

    [Let’s see if I can post without breaking the forum this time!]

    #17907

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    It seems that many of id’s early engines had fixed physics rates (I believe some source ports removed this limitation, though sometimes altering jump trajectories by a fair amount):
    https://forums.anandtech.com/threads/why-is-the-original-doom-capped-at-35-fps.2444832/
    Doom 1: 35 fps
    Quake 1: 72 fps
    (Quake 2 had an annoying fixed server-side rate of only 10 fps!)
    Quake 3: 120 fps?
    Doom 3: 60 fps?

    It seems that in order to preserve the homing-missile curves in Descent 1 & 2, the source ports added a fixed physics rate just for the homing missiles.

    #17908

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    Prior to Doom3 I think physics were updated with the same frequency as the graphics.

    There are essentially three different ways of doing it:
    1. Fully-framerate-dependent: A door opening takes N frames (obviously bad for any type of game with varying framerate).
    2. Framerate-compensated: Physics updates are still done in sync with generating frames but take into account the elapsed time since the last frame. This gives correct results in simple cases (constant velocity isn’t a problem, constant acceleration/deceleration can be handled fairly easily too) but often gives wrong results for more complex cases such as jumps.
    3. Framerate-independent: Physics updates completely uncoupled from graphics-updates.
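    The weakness of option 2 is easy to demonstrate: integrating a jump with explicit Euler at different timesteps (a deliberately crude toy, with made-up values for jump velocity and gravity) gives different peak heights:

```python
def jump_peak(dt, v0=5.0, g=-10.0):
    """Explicit-Euler ascent of a jump; returns the peak height reached."""
    y, v = 0.0, v0
    while v > 0.0:        # integrate until upward velocity is exhausted
        y += v * dt
        v += g * dt
    return y

# Analytically the peak is v0**2 / (2 * |g|) = 1.25, but explicit Euler
# overshoots more at coarser timesteps, so under scheme 2 a 30 fps player
# jumps visibly higher than a 120 fps player:
peak_30fps = jump_peak(1.0 / 30.0)    # ~1.33
peak_120fps = jump_peak(1.0 / 120.0)  # ~1.27
```

    A fixed physics timestep (option 3) makes both players integrate the same trajectory regardless of how fast their machines render.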

    Discussion about it in Doom3: https://forum.beyond3d.com/threads/doom-3-60fps.7181/
    I don’t know which of the latter two Descent used. Possibly a hybrid.

    #17912

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    When Doom 1 came out in 1993, people’s rendering framerate was often less than 35 Hz, but the physics would still operate at 35 Hz. The problem is that as computers got faster, the 35 Hz cap became a hindrance to the very smooth gameplay that was becoming possible. Other than homing physics, Descent 1 & 2 seemed to scale well with higher rendering framerates.

    #17913
    Moon
    Kickstarter Backer
    Topics: 6
    Replies: 10

    Thanks. I’m not entirely sold on 4K; I might yet stick with FullHD this time around, and so far I haven’t had any issues with 60-FPS-only gameplay on a friend’s PC. So which GTX 1060 should it be, and is a 400-watt power supply enough?

    #17914

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    The nice thing about UHD is that it can also run FullHD very well: each FullHD pixel maps exactly onto a 2×2 block of UHD pixels, so you have the option to run either resolution equally well. Just be sure to look for HDMI 2.1 and/or DisplayPort 1.3 support for future computer upgrades. That saves you from having to upgrade to a 4K display later, which would cost more money overall.
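    That integer-scaling claim is easy to check (resolutions are the standard ones; 1440p is included just for contrast, not mentioned above):

```python
uhd = (3840, 2160)       # 4K UHD
fullhd = (1920, 1080)    # FullHD: maps to exact 2x2 blocks of UHD pixels
qhd = (2560, 1440)       # 1440p, shown for contrast

scale_fullhd = (uhd[0] / fullhd[0], uhd[1] / fullhd[1])  # (2.0, 2.0) - integer
scale_qhd = (uhd[0] / qhd[0], uhd[1] / qhd[1])           # (1.5, 1.5) - fractional, so it blurs
```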

    #17917

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    When Doom 1 came out in 1993, people’s rendering framerate was often less than 35 Hz, but the physics would still operate at 35 Hz. The problem is that as computers got faster, the 35 Hz cap became a hindrance to the very smooth gameplay that was becoming possible. Other than homing physics, Descent 1 & 2 seemed to scale well with higher rendering framerates.

    Yeah, I know, I used to play it on a 486SX. 🙂
    You’re right though, kind of. What happens is rendering followed by playing catch-up with the physics; in Doom3 (as far as I know) the physics actually runs on a separate thread, making it completely independent. Aside from fixing the slight discrepancies (with jumps in particular), doing it that way also gives you better input responsiveness (as long as inputs are also checked along with the physics). It’s kinda half-way between what I numbered 2 and 3, I guess.

    (think we derailed this thread a bit tbh!)

    #17918

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    Thanks. I’m not entirely sold on 4K; I might yet stick with FullHD this time around, and so far I haven’t had any issues with 60-FPS-only gameplay on a friend’s PC. So which GTX 1060 should it be, and is a 400-watt power supply enough?

    I would go for a bit higher than 400W personally but if you only plan to have one hard-disk in there (or one SSD and one normal) then it should be fine – NVidia themselves recommend 400W (with the gfx card taking 120W). There doesn’t seem to be much difference between the various GTX1060s except for RAM capacity. You’d probably want at least the 5GB one – even my relatively old GTX760 has 2GB. The 6GB one is a bit faster as well though.

    #17920
    TwoCables
    Kickstarter Backer
    Topics: 118
    Replies: 1474

    Thanks. I’m not entirely sold on 4K; I might yet stick with FullHD this time around, and so far I haven’t had any issues with 60-FPS-only gameplay on a friend’s PC. So which GTX 1060 should it be, and is a 400-watt power supply enough?

    Which 400W PSU do you have exactly? A good quality-made 400W PSU is more than enough, but a generic low-quality one is not. There’s a *LOT* more that can be said, but just for now, try to find out the exact brand *and model* you have.

    Also, getting a G-SYNC monitor with a high refresh rate isn’t a matter of getting away from problems or issues with 60 FPS and 60 Hz, it’s about improving your gaming experience – and tremendously so.

    When I upgraded from a 60 Hz monitor with a native resolution of 1680 x 1050 to a 144 Hz 1920 x 1080 G-SYNC monitor, the differences I experienced surprised me quite a bit. I thought “LOL yeah right, this won’t be a big deal”, but I would say it was a fairly big deal for me. It’s not that I had problems with the 60 Hz monitor, no. I thought, “pff, G-SYNC is just a stupid Placebo Effect gimmick that fills NVIDIA’s pockets with money”. I was dead wrong. I highly recommend that you at least *try* G-SYNC. You can always return the monitor before the return window closes.

    Prepare for Overload…

    #17921

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    Probably wouldn’t notice much difference between a 144 Hz non-G-Sync display and a 144 Hz G-Sync display. When the screen is refreshed every 7 milliseconds, fast sync would probably work just fine.

    I’ve been reading that 500 W is the minimum PSU for 1080 cards; the 1060 is probably not too far off. It’s usually best to get a bit more than the minimum just to be on the safe side.

    #17922

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    Probably wouldn’t notice much difference between a 144 Hz non-G-Sync display and a 144 Hz G-Sync display. When the screen is refreshed every 7 milliseconds, fast sync would probably work just fine.

    I’ve been reading that 500 W is the minimum PSU for 1080 cards; the 1060 is probably not too far off. It’s usually best to get a bit more than the minimum just to be on the safe side.

    *laughs* That’s pretty much what I’ve been saying!
    I think hypersonic is trying to wind me up. *laughs*

    Edit: I mean regarding the G-Sync refresh-rate thing. As for the PSU, NVidia says 400 W should be fine for the average setup, but obviously it depends on what else in the PC is drawing current.

    #17925

    hypersonic
    Kickstarter Backer
    Topics: 18
    Replies: 220

    Not trying to wind anyone up, maybe I’ve read something the wrong way!

    These new 4k 144hz HDR monitors seem awesome
    https://www.pcgamer.com/asus-is-releasing-a-27-inch-4k-144hz-hdr-monitor-with-g-sync-in-june-for-dollar2000/
    https://www.theverge.com/circuitbreaker/2018/5/24/17388154/asus-pg27uq-release-date-price-specs

    However, I think these are the monitors with the lossy chroma subsampling that this Reddit thread was warning folks about:
    https://www.reddit.com/r/hardware/comments/8rlf2z/psa_4k_144_hz_monitors_use_chroma_subsampling_for/

    No way would I pay $2000 for a chroma-subsampled display!

    Who came up with 144 Hz anyway? Its frame time (about 6.94 ms) doesn’t fit the usual family based on 8 1/3 ms (120 Hz) the way the vast majority of other framerates do (4 1/6 ms = 240 Hz, 2 1/12 ms = 480 Hz).
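    A quick check of the arithmetic behind that (just frame times for common refresh rates; nothing authoritative):

```python
def frame_time_ms(hz):
    return 1000.0 / hz

base = frame_time_ms(60)   # 16 2/3 ms
# 120, 240 and 480 Hz frame times fit a whole number of times into the
# 60 Hz frame time; 144 Hz gives 2.4, which is what makes it the odd one out.
ratios = {hz: base / frame_time_ms(hz) for hz in (120, 240, 480, 144)}
```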

    #17926

    bwabbit
    Participant
    Topics: 2
    Replies: 18

    No, I just find it funny, as I already expressed that I didn’t think there should be a big difference with a high-refresh-rate monitor regardless of whether it has G-Sync or not. 🙂

    Where are you from, hypersonic? I’m curious.
