A few days ago, while playing with the settings on my laptop, I decided to see how overclocking the GPU would affect my cryptocurrency mining hashrate. From my understanding, a GPU running at a higher clock frequency should be able to perform more calculations per second, thus increasing the mining hashrate.
I've previously mined on my desktop workstation, which has an AMD Radeon RX 570 GPU. My laptop, on the other hand, has an Nvidia GeForce GTX 1060 graphics card. Until recently, Nvidia GPUs weren't widely supported by mining software. The only mining software I've tried so far that can easily use Nvidia cards is the MinerGate miner, which is also the most user-friendly miner I've ever used. I'm sure someone reading this has experience with other Nvidia-friendly miners, so feel free to enlighten me.
Normal clock frequency
Anyway, I did a test run mining Monero at the normal clock frequency (1835 MHz). This yielded a GPU hashrate of 426 H/s.

Turbo
Next, I set the overclock setting to Turbo (lol). This bumped the clock frequency up to 1873 MHz. Still, even at this higher clock frequency, the mining hashrate didn't change. It was stuck at 426 H/s.
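For scale, here's a quick back-of-the-envelope check: even if the hashrate scaled perfectly linearly with the core clock (an assumption, not a given), the Turbo bump is so small that the expected gain would be only a couple of percent.

```python
# Back-of-the-envelope check: if hashrate scaled linearly with core
# clock (an assumption, not a given), how big a gain should the
# Turbo setting produce? The numbers are the ones measured above.
base_clock = 1835    # MHz, normal setting
turbo_clock = 1873   # MHz, Turbo setting
base_hashrate = 426  # H/s measured at normal clock

expected = base_hashrate * turbo_clock / base_clock
gain_pct = (turbo_clock / base_clock - 1) * 100
print(f"expected: {expected:.1f} H/s ({gain_pct:.1f}% gain)")
# -> roughly 435 H/s, about a 2% gain
```

So even in the best case, a 38 MHz bump would buy only around 9 H/s, which is small enough to disappear into measurement noise.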
The number below the GPU hashrate shows the ten-second average GPU hashrate. As you can see in the picture above, the ten-second average appears to have increased from 384 H/s to 416 H/s after overclocking was enabled. However, that average was fluctuating constantly, so it's just a coincidence that it happened to be higher with overclocking.
This leaves me wondering: did overclocking not make the GPU mine faster? Or did it mine faster, but the timing reference sped up as well, so the measurement wrongly showed the same number? I would love some input, so if you have any relevant knowledge or ideas, please share them in the comments section.