How to reduce input latency and make the game smooth.  (Read 103008 times)

holyspam

  • 1337
  • *
  • Posts: 343
  • Country: gr
How to reduce input latency and make the game smooth.
« on: November 19, 2019, 18:14 »
This is my first post on the forums, so hello to everyone.

I have written a modern, basic optimisation guide for UT2004. I also measured input latency with my phone recording at 240fps, using shock rifle primary fire both online and offline, to make sure these tweaks don't make the experience worse.

(I recorded myself clicking the mouse in front of the monitor, then counted frames from the mouse click to the shock primary beam appearing.)
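To turn those frame counts into milliseconds: each frame of 240fps video is about 4.17ms, so the arithmetic is just this (a quick sketch of the measurement math, nothing from the game itself):

```python
def latency_ms(frames_counted: int, camera_fps: int = 240) -> float:
    """Estimate input latency from the number of video frames counted
    between the mouse click and the shock beam appearing on screen."""
    return frames_counted * 1000.0 / camera_fps

# 1-2 camera frames at 240fps correspond to roughly 4-8ms
print(round(latency_ms(1), 2))  # 4.17
print(round(latency_ms(2), 2))  # 8.33
```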

I have also managed to find a way to run the game with its native DX8 implementation in fullscreen exclusive mode on Windows 10, which cuts another 4-5ms of input latency compared to the DX9 wrapper or "fullscreen windowed" mode.

These tweaks combined allow for 4-8ms input latency in Instant Action. The best I could manage online on a 3SPN server was 12-16ms, which I assume is because NewNet takes an additional tick to process the netcode's "startfire" function.

Here it is, step by step.

Step 1 (Windows 10 only):


Quote
----=================UPDATE 2022==================----

You can run the game in 100% native DX8 mode, by grabbing the dll from the attachment in this link.
https://www.vogons.org/viewtopic.php?f=8&t=47772
Explanation is included in that link.

----==============================================----
While the DX8 API is what the game was made with and is more responsive, *your specific system configuration* might work best with the DX9 converter.
Try them both for a few days.


Find your installation directory. For example mine is "C:\UT2004\system"
Paste d3d8.dll inside the system folder.
Alternatively, right-click the "Play UT2004" shortcut and choose "Open file location"; it will take you directly there.


Step 2 (FPS limiter):

For online play, the game will limit your framerate to a default of 90.
When your netspeed is lower than 10000 this can't be changed, but when netspeed is higher than 10k the limiter will use the "MaxClientFrameRate" value instead.

The ingame limiter works in 1-millisecond intervals.
This means that the value you set in your ini as "MaxClientFrameRate" is converted into milliseconds (1000/MaxClientFrameRate).

This results in the following fps limits ingame:
100, 111, 125, 142, 166, 200, 250, 333, which correspond to
10ms, 9ms, 8ms, 7ms, 6ms, 5ms, 4ms, 3ms.

In order to apply a limit of 100 fps, you should use your desired FPS + 1 for "MaxClientFrameRate".
This means MaxClientFrameRate=101 results in 100fps ingame, 112 results in 111fps, 126 in 125, and so on.
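The rounding above can be sketched like this (my own reconstruction of the observed behaviour, not Epic's actual code: the requested frame interval appears to be rounded up to a whole millisecond):

```python
import math

def effective_fps_cap(max_client_frame_rate: int) -> int:
    """Approximate the ingame cap: the requested frame interval
    (1000 / MaxClientFrameRate) is rounded up to whole milliseconds."""
    interval_ms = math.ceil(1000 / max_client_frame_rate)
    return 1000 // interval_ms

print(effective_fps_cap(101))  # 100
print(effective_fps_cap(112))  # 111
print(effective_fps_cap(126))  # 125
```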

You can use an external limiter like RTSS instead, which accepts *any* value as the limit, but know that it will add about one frame of input latency compared to the ingame limiter.

As you reach 200-250 fps the engine doesn't cope so well, because it syncs everything in one main thread; you will need a very powerful CPU (4GHz and above) that can finish all of its work in less than 4ms.

To change your fps limit, you can either edit the line "MaxClientFrameRate=XXX" in ut2004.ini, or type this in the console ingame:
Code: [Select]
set Engine.LevelInfo MaxClientFrameRate XXX
where XXX is your desired framerate.
The default limit is 90.

Increasing your FPS will also increase the number of network updates sent to the server, which will result in higher network latency.
Due to the netcode this is mostly irrelevant, unless you have data limits on your connection.
You can adjust the rate of network updates from F7, with the "Desired Net Update Rate" slider.
If your system cannot achieve the set framerate, it will drop to the next available FPS step; this can create stutters and warps that lead to desynchronisation with the server. MORE is not always better.



Step 3 (GPU settings):

Nvidia
Go to the Nvidia Control Panel.
Manage 3D settings.
Program settings.
Choose the ut2004.exe profile, or make one.
Find the option named "Maximum Pre-Rendered Frames" or "Low Latency Mode".
This can help with games that achieve VERY high fps like UT2004, but can mess with newer games.
DO NOT SET THIS GLOBALLY.
Change it to 0/1 or Low/Ultra.
Find "Power Management" and set it to Maximum Performance.

AMD
One thing to enable is called Radeon Anti-Lag.
Open Radeon settings.
Go to Graphics.
Find UT2004.
Check that Anti-Lag is on.
The setting equivalent to Nvidia's "Max pre-rendered frames" is no longer included in the settings; it requires a registry tweak and is applied globally.
For this reason I will not provide a guide, only the name of the program and the setting.
Use RadeonMod and find the value "FlipQueueSize".
I am using 0x3100, which is equivalent to Nvidia's max pre-rendered frames of 1, or low-latency mode "Ultra".
Values are 0x3100 for 1 frame, 0x3200 for 2 frames and 0x3300 for 3 frames (also the default); beyond that you're adding input lag.
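The three registry values above follow a simple pattern (an observation from the listed values only; AMD doesn't document this encoding, so treat it as an assumption): the frame count sits in the second hex digit, i.e. 0x3000 + frames * 0x100.

```python
def flip_queue_value(frames: int) -> str:
    """Build the FlipQueueSize registry value for a given queue depth,
    following the pattern of the three values listed above (hypothetical
    helper based on that pattern, not on AMD documentation)."""
    if not 1 <= frames <= 3:
        raise ValueError("useful range is 1-3 frames")
    return hex(0x3000 + frames * 0x100)

print(flip_queue_value(1))  # 0x3100
print(flip_queue_value(3))  # 0x3300 (the default)
```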

Intel users, sorry. Get a real GPU. :D

Step 4 (Ingame):

Make sure "Reduce Mouse Lag" is not checked. (Settings->Input)

It's best to set all graphics settings to Lowest or off.


Step 5 (Windows 10+ only):

Right click on ut2004.exe, or the desktop shortcut, open Properties.
Go to Compatibility.
Check "Disable fullscreen optimizations".
Make sure everything else is unchecked.
IF YOU USE ANY OVERLAY, IT MIGHT CAUSE A SYSTEM/DRIVER HANG ON OLD DRIVERS.


Optional


For APU/iGPU users
Open ut2004.ini, find
Code: [Select]
AvoidHitches=False
and change it to True.
This will make the game keep everything in VRAM. It helps a lot on my integrated GPU (Ryzen 2200G).
The default is False.

1000Hz mouse users
If you have a 1000Hz mouse, open user.ini, find "MouseSamplingTime=" and change it to 0.001 for 1000Hz, 0.002 for 500Hz, or 0.004 for 250Hz.
The default is 0.0083.
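The pattern here is simply the polling interval in seconds, 1 / polling rate; the 0.0083 default corresponds to roughly 120Hz (a sketch of the arithmetic, not game code):

```python
def mouse_sampling_time(polling_rate_hz: int) -> float:
    """MouseSamplingTime is just the polling interval in seconds."""
    return round(1.0 / polling_rate_hz, 4)

print(mouse_sampling_time(1000))  # 0.001
print(mouse_sampling_time(500))   # 0.002
print(mouse_sampling_time(250))   # 0.004
```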

Potentially harmful tweaks
KeepAliveTime
No, that setting has absolutely no gameplay value.
It does exactly what it says it does: it makes sure the connection doesn't time out and close when no data is being sent.

It also doubles the amount of data you use and increases your ping.
AVOID IT.
The default is 0.2.
Quote
----================UPDATE 2022=================----
Miasma now provides a custom movement tickrate slider in-game(F7 menu) to achieve what everyone thought KeepAliveTime did.
----==============================================----

Process priority
While it helps at first, it can cause synchronisation issues over time between the different components of the game: network, input, audio and display. Better to keep it at default.


Running the game on one core
This provides the lowest input lag and most consistent feeling, but you leave a lot of performance on the table. You will need a very powerful CPU to maintain high fps - meaning 120 or more - in crowded servers and maps.

The reason is that UT2004 was coded in an era of single-core CPUs that ran at a fixed frequency, and the game targeted Windows XP.

The timer function used by the game (TSC, read via __rdtsc) is now considered legacy and is not accurate on systems with power saving or turbo, or when the main game thread moves from core to core.
A way around this is to disable power saving, turbo and C-states, and use the Windows "High Performance" or "Ultimate Performance" power plan.

To pin the game to one core, open Task Manager, find the game process (ut2004.exe), right-click, choose "Set affinity" and select only one core.

There are also a few other tweaks like enabling MSI/MSI-X mode for your devices and using CPU affinity for USB controllers, but those are system-wide and might create issues with other games or programs, or make things worse if you don't know what you're doing.

Enjoy!
« Last Edit: August 19, 2022, 13:11 by holyspam »

kops

Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #1 on: November 20, 2019, 16:26 »
Thanks for posting this up here. It looks like a number of people have already found this helpful.

Piglet

  • 1337
  • *
  • Posts: 3258
  • Country: gb
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #2 on: November 20, 2019, 17:41 »
Questions:

1. How does the game know to use the dll?

2. What does the console command do? The variable is defined as "var globalconfig float MaxClientFrameRate;" - so I can't see what providing "142,166,200,250" to it is intended to do. It can only take a single floating point number.

3. How does changing MaxClientFrameRate through console have any effect different to just configuring it in the ut2004.ini file?

4. Changing MaxClientFrameRate needs to be done in conjunction with a change in netspeed to have any effect on uncapping the frame rate. Something like this is a good idea - press numpad 8 at the start of every map to take effect: NumPad8=Stat Net | stat fps | netspeed 10880

5. AvoidHitches? What is it and what does it do? I can only see its value being reset in the code - so I can't tell what it's doing.

Nardaq_NL

  • 1337
  • *
  • Posts: 454
  • Country: nl
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #3 on: November 20, 2019, 18:06 »
1. I think it looks first in the root folder (ut2004/system) and then for the DLL in the windows/system folder.
As for which one to use:
[Engine.Engine]
-> RenderDevice=D3DDrv.D3DRenderDevice
-> RenderDevice=D3D9Drv.D3D9RenderDevice

2. The FPS steps have to do with this

3 none  ;D

4. Yup, I use Tab for that: "Tab=ScoreToggle | setspectatespeed 5000 | netspeed 15000" (setspectatespeed is also really handy)

5 see here

This won't apply to me as I'm still using Windows 7  8)
« Last Edit: November 20, 2019, 18:09 by Nardaq_NL »

Piglet

  • 1337
  • *
  • Posts: 3258
  • Country: gb
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #4 on: November 20, 2019, 18:47 »
Yeah, but you can't set Engine.LevelInfo MaxClientFrameRate 142,166,200,250. You have to pick one value and match it with netspeed.

jiRNGen

  • Full Member 
  • *
  • Posts: 73
  • Country: fr
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #5 on: November 20, 2019, 19:13 »
I use a 144Hz monitor and have a good internet connection, so MaxClientFrameRate=250 and MaxClientRate=20000 or 30000 work perfectly for me. One important setting: UseVSync=False (everywhere) will prevent a lot of input lag and add comfort, especially when moving the mouse around.
 



 
« Last Edit: November 20, 2019, 19:19 by Zinst »

Piglet

  • 1337
  • *
  • Posts: 3258
  • Country: gb
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #6 on: November 20, 2019, 19:26 »
What's the reason for max rate over monitor refresh rate?

jiRNGen

  • Full Member 
  • *
  • Posts: 73
  • Country: fr
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #7 on: November 20, 2019, 20:44 »
What's the reason for max rate over monitor refresh rate?

I tried setting MaxClientFrameRate to 144 before, and recently 200-250; I didn't see a big difference.

holyspam

  • 1337
  • *
  • Posts: 343
  • Country: gr
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #8 on: November 21, 2019, 17:21 »
Questions:

1. How does the game know to use the dll?

2. What does the console command do? The variable is defined as "var globalconfig float MaxClientFrameRate;" - so I can't see what providing "142,166,200,250" to it is intended to do. It can only take a single floating point number.

3. How does changing MaxClientFrameRate through console have any effect different to just configuring it in the ut2004.ini file?

4. Changing MaxClientFrameRate needs to be done in conjunction with a change in netspeed to have any effect on uncapping the frame rate. Something like this is a good idea - press numpad 8 at the start of every map to take effect: NumPad8=Stat Net | stat fps | netspeed 10880

5. AvoidHitches? What is it and what does it do? I can only see its value being reset in the code - so I can't tell what it's doing.

1.
For DX8 games, the executable looks for d3d8.dll next to itself first; if it can't find it there, Windows handles it. (The same applies to all .dll files required by programs.)

D3D8.dll is the name of the windows Direct3D 8 library responsible for rendering in DirectX 8 and DirectX 8.1 games.
It was used in Windows 98/XP, maybe 95/2000 too, I can't remember.

Even when DX9 was released, support was still provided for earlier versions.
The 32-bit version of UT2004 uses DirectX 8.1.

DX9 was not included in Windows Vista and 7; only DX10-11 is included, and in order to play old games you have to install the "DirectX 9 Redistributables" yourself, which also provide support for DX8 and earlier, so this dll might not help much there.

Then for Windows 8/10, m$ decided that DX9 should "run" inside DX11... but only D3D9 is "properly" supported.
They called it D3D9 Extended (D3D9Ex).

Using earlier versions of D3D you get something like emulation inside D3D9Ex, which adds complexity to the rendering process and creates issues for old games, since everything runs on D3D11.

So these guys took the SDK (software development kit) for DX8 and DX9 and, instead of emulating DX8 or looking for quick fixes, translated all the DX8 code to DX9, which requires lots of testing and feedback, almost impossible for one person.

But the result is much faster than emulation, and it lets every game actually work properly.

When people started using it for things like ReShade, it got lots of feedback, and old games ran on newer Windows without issues; they even fixed some graphics bugs in games while translating, and even some old unpatched crashes.

Anyway, they put all of this inside a d3d8.dll, so you don't have to mod your exe or any other file.

So instead of this:
D3D8->D3D9
D3D9->D3D11
you get something like this:
(D3D8)D3D9->D3D11

For me this changes the game from mega stutters and lag to super smooth.


2.
The console command changes the variable MaxClientFrameRate in class LevelInfo inside class Engine.
You can do this without exiting the game to edit the ini; that's the only advantage, but you have to do it in the server browser, otherwise it's blocked.
You can use this syntax for EVERYTHING inside ut2004.ini and user.ini:
Code: [Select]
set CLASS PROPERTY VALUE
CLASS is what's indicated inside brackets, like [Engine.LevelInfo].
PROPERTY is the plain text followed by the "=" symbol and then the VALUE specified.

3.
The numbers provided are the limits that ut2004 actually enforces depending on the system timer used; there are more details in "The holy grail of ut2004".
I think the render engine works on 1ms timers and you get one frame every X ms, which results in these actual limits:
8ms = 125fps
7ms = 142fps
6ms = 166fps
5ms = 200fps
4ms = 250fps
If I pick a framerate in between, it will oscillate between the two limits.

4.
Yes, you need a netspeed of 10001 in order to unlock framerates higher than the default of 90 (maybe 85 on certain systems). There is still a hard limit of 250fps at 10001 netspeed, but if you increase netspeed even more, the limit is whatever you set in MaxClientFrameRate.

5.
Well, it does what the name implies: it probably sacrifices some speed for consistency in framerates (the difference might have been bigger on older systems). I'm not sure about the technical side; you'll have to ask Epic for that.
It tries to hold all textures in VRAM, to avoid moving them back and forth. I was mostly testing various things from the .ini and noticed that turning it on made my mouse movements smoother.
« Last Edit: January 19, 2021, 11:08 by holyspam »

hagis

  • 1337
  • *
  • Posts: 404
  • Country: gb
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #9 on: November 22, 2019, 16:00 »
hello :)

I don't tend to mess with any settings generally but thanks for posting, it's interesting :)

holyspam

  • 1337
  • *
  • Posts: 343
  • Country: gr
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #10 on: November 26, 2019, 23:24 »
One extra setting to reduce input lag for nvidia users.

DO NOT CHANGE GLOBAL SETTINGS FOR THIS ONE, IT MIGHT NOT WORK WELL FOR ALL GAMES.
THIS WILL DROP YOUR FPS BUT YOU GET FASTER FRAME DELIVERY


1. Go to the Nvidia Control Panel.
2. Manage 3D settings.
3. Program settings.
4. Find Unreal Tournament 2004; if it doesn't exist, ADD IT.
5. Now edit the ut2004 profile, find "Maximum pre-rendered frames" and set it to 1.

That's it!

This might make a huge difference, depending on your setup.




For AMD users you need to edit the registry or download RadeonMod. I won't go into details because it's dangerous; the setting name is Flip Queue Size.
« Last Edit: November 26, 2019, 23:26 by holyspam »

Piglet

  • 1337
  • *
  • Posts: 3258
  • Country: gb
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #11 on: December 02, 2019, 23:28 »
d3d8.dll has been causing GPFs for me.

Renamed it. GPFs stopped.

Nardaq_NL

  • 1337
  • *
  • Posts: 454
  • Country: nl
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #12 on: December 06, 2019, 16:12 »
Anyone have a solution for this? It's been driving me nuts for a very long time.

I've no idea how others manage to shoot a moving Manta at close range, for example; I have to aim ahead to be able to hit anything.


The aim is not off. It's the delay between pressing fire and the shot actually happening.
« Last Edit: December 06, 2019, 18:52 by Nardaq_NL »

jiRNGen

  • Full Member 
  • *
  • Posts: 73
  • Country: fr
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #13 on: December 07, 2019, 00:20 »
Not sure if it's the same as what I had, but I fixed it by disabling every field in the input settings.
And in ut2004.ini I set UseVSync=False everywhere.

Vsync adds a lot of input lag... it's unplayable for me.

Nardaq_NL

  • 1337
  • *
  • Posts: 454
  • Country: nl
Re: How to reduce ut2004 input lag - no placebo (Win8 and newer)
« Reply #14 on: December 07, 2019, 12:29 »
I've tried the dll file, the fps limit, v-sync, a new monitor, the 64-bit build, and many other things I've forgotten.