Question: Cat won't use GPU, instead uses Intel 4000
- John-Aaron Hill
- Author
- Offline
- Newbie
- Posts: 2
- Thank you received: 0
- dakson dayllom
- Offline
- Newbie
- Posts: 5
- Thank you received: 3
I'll assume you've already tried everything in your Nvidia control panel.
In that case, I recommend trying to disable the Intel 4000 graphics directly in the BIOS.
If it can't be disabled through the BIOS (it can on my Dell Latitude E6520), and disabling the Intel 4000 graphics from the control panel of the Nvidia GTX 670 MX also didn't work, then I know a risky maneuver that might help.
I tested it on my notebook a while ago, though I can't remember whether it worked.
It goes like this:
Disable the Intel adapter in Windows Device Manager. The graphics will turn ugly; restart the notebook and wait for the OS to fall back to another form of rendering, which may end up being your CPU.
If that doesn't work, just enable the Intel 4000 graphics device again and reboot.
Sorry for the English, I never took practicing it seriously.
To tell the truth, I learned English from games, which is why I use a translator to improve the text.
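For anyone who wants to verify what the OS reports before and after disabling the Intel adapter, here is a minimal C++ sketch that lists the DXGI adapters Windows exposes (the first one enumerated is what a Direct3D application gets when it asks for the default adapter). It only prints what Windows reports and is not part of Catzilla:

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    // Adapter 0 is the one a Direct3D application receives by default;
    // on Optimus laptops this is usually the Intel GPU driving the display.
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %ls (dedicated VRAM: %llu MB)\n",
                i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}

If the Intel device was successfully disabled, it should disappear from this list after the reboot.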
- Aaron Grizzle
- Offline
- Newbie
- Posts: 1
- Thank you received: 0
dakson dayllom wrote: I'll assume you've already tried everything in your Nvidia control panel. In that case, I recommend trying to disable the Intel 4000 graphics directly in the BIOS. [...]
You can't disable the Intel HD 4000 on these systems. The two cards work together, much like the old Monster II cards did back in the day: Intel handles 2D, then the GeForce is supposed to kick in for 3D. I have a Sager 9170 with a GTX 680 and I have the same problem. Cat says it is using the Intel adapter, so either the program needs to be updated, the drivers from Nvidia or Intel need to be updated, or the GeForce card is actually being used and just misreported by Cat. Either way, I am not getting close to the performance the OP is; my best is 1800.
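If it really is the program that needs updating, the usual application-side fix on these hybrid-graphics laptops is for the .exe itself to export the GPU-preference symbols that the NVIDIA and AMD drivers check at load time. This is only a sketch of that mechanism, something the Catzilla developers would have to apply, not end users:

// Exported from the application's .exe (not from a DLL). Hybrid-graphics
// drivers look for these symbols when the process starts and, if present,
// route rendering to the discrete GPU instead of the integrated Intel one.
extern "C" {
    // NVIDIA Optimus: any non-zero value requests the high-performance GPU.
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    // AMD PowerXpress/Enduro equivalent; ignored on NVIDIA-only systems.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

As a user-side workaround, adding the benchmark to the "High-performance processor" list in the NVIDIA control panel's program settings attempts the same thing from the driver side.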
- dakson dayllom
- Offline
- Newbie
- Posts: 5
- Thank you received: 3
Since my Dell Latitude E6520 can disable Optimus (Nvidia's GPU-switching technology) and use only the dedicated GPU, I decided to test it.
After some initial "strangeness" and a few reboots...
the result was basically the same (less than 100 points difference).
What happens is that the program (Catzilla) identifies the wrong GPU.
Here you can see the "big" difference with your own eyes...
Without disabling:
www.catzilla.com/showresult?lp=151155
Disabled:
www.catzilla.com/showresult?lp=151161
And I'm happy, because my CPU is a monster (as expected)... and I know how to partially work around the GPU problem. I just wonder if it would be possible to use a PCI Express 16x slot instead of 1x (or 2x, I have to check).
Could someone here send me a PM with solutions I could use without compromising the 1394 output?