I’m getting much lower FPS than I would expect at higher resolutions (1920x1200).
+showbudget showed that most of the frame time was spent in Swap_Buffers. However, I noticed that the Swap_Buffers time was significantly reduced if I set:

mat_viewportscale 0.99999
Obviously the number of pixels being drawn isn't significantly different, so can anyone explain what's going on here?
It seems to point at something odd in the fragment shaders. Is there a code path in there that gets disabled when the viewport scale isn't exactly 1?
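For anyone who wants to reproduce this, the console sequence I'm describing is roughly the following (cvar names as they appear in the Source engine developer console; exact availability and whether sv_cheats is required may vary by game branch):

```
// Source engine developer console
sv_cheats 1                  // the budget panel is typically cheat-protected
+showbudget                  // display the frame budget panel; Swap_Buffers dominates
mat_viewportscale 1          // default full-size viewport -> low FPS
mat_viewportscale 0.99999    // near-identical pixel count -> Swap_Buffers time drops
```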
Here's my sysinfo:
Processor Information:
Vendor: GenuineIntel
Speed: 2671 Mhz
4 logical processors
4 physical processors
HyperThreading: Unsupported
FCMOV: Supported
SSE2: Supported
SSE3: Supported
SSSE3: Supported
SSE4a: Unsupported
SSE41: Unsupported
SSE42: Unsupported
Operating System Version:
Windows 7 (64 bit)
NTFS: Supported
Crypto Provider Codes: Supported 323 0x0 0x0 0x0
Video Card:
Driver: NVIDIA GeForce GT 520
DirectX Driver Name: nvd3dum.dll
Driver Version: 9.18.13.623
DirectX Driver Version: 9.18.13.623
Driver Date: 30 Aug 2012
Desktop Color Depth: 32 bits per pixel
Monitor Refresh Rate: 59 Hz
DirectX Card: NVIDIA GeForce GT 520
VendorID: 0x10de
DeviceID: 0x1040
Number of Monitors: 2
Number of Logical Video Cards: 2
No SLI or Crossfire Detected
Primary Display Resolution: 1920 x 1200
Desktop Resolution: 3840 x 1200
Primary Display Size: 26.65" x 16.65" (31.42" diag)
67.7cm x 42.3cm (79.8cm diag)
Primary Bus: PCI Express 16x
Primary VRAM: 1023 MB
Supported MSAA Modes: 2x 4x 8x
Sound card:
Audio device: Speakers (Realtek High Definiti
Memory:
RAM: 8191 Mb
Miscellaneous:
UI Language: English
Microphone: Not set
Media Type: DVD
Total Hard Disk Space Available: 5246124 Mb
Largest Free Hard Disk Block: 1400325 Mb
OS Install Date: Dec 31 1969
Game Controller: None detected