Nvidia GTX 580


micro181

Solved by micro181.

Tommy's pic is right, but with the 580, 40% should be fine... I am guessing the laptop CPU may be holding things back rather than the GPU.

Setting it to zero does help slightly with the redraw speed; it just makes anything angled or curved more "jaggy" until you do a final view.

Not sure whether X6 works differently as far as using hardware AA goes.


Hi Drawzilla,

 

It only lets me change them to 150, and the quality is slightly worse, but it still does not speed up the 3D viewing. It must be a conflict with X1, as it's OK in newer versions. I may have to invest in X6 sometime. Thanks for all your help.


I think the 1500 and 600 in Micro's DBX refer to mm, not inches, don't they? That's very approximately 60" and 24".
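The conversion above can be checked with a quick throwaway sketch (not from the thread, just the standard 25.4 mm-per-inch arithmetic):

```python
# Sanity-check the mm-to-inch conversion: 1500 mm and 600 mm
# are roughly the 60" and 24" mentioned above.
MM_PER_INCH = 25.4

def mm_to_inches(mm):
    return mm / MM_PER_INCH

print(round(mm_to_inches(1500), 1))  # 59.1, close to 60"
print(round(mm_to_inches(600), 1))   # 23.6, close to 24"
```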

 

Try unchecking "keep all surfaces". I think X1 is drawing EVERYTHING in the 3D image even when it isn't necessary, e.g. behind walls.

 

Hit "Help" while in the DBX and see what it says about "keeping all surfaces".

 

 

M.


Hi Kbird, you are correct, mine is in mm. I unchecked "keep all surfaces", but alas it is still slow.

 

When I say slow, I mean there is a delay when you use the mouse. That doesn't happen in any of the newer versions. At the end of the day it still works, just not as smoothly.


A lot has changed with regard to rendering since X1 was released. I am somewhat surprised that the NVIDIA card is slower, but there are some really good ATI cards out there. The fact that X6 is a lot faster than X1 is no surprise though as we have worked on making OpenGL use more modern techniques, especially with X6.


You shouldn't use hardware edge smoothing and software edge smoothing at the same time.  Software edge smoothing is capable of achieving better results than hardware edge smoothing but will dramatically decrease performance as it basically involves rendering each frame several times (between 2 and 15 times, depending on the level the preference is at).  You can therefore expect performance to drop by a factor of roughly between 2 and 15 (again depending on the preference).

 

Unless you're using a very low-end GPU that either doesn't support mid-range MSAA or supports it but suffers a dramatic performance loss, I would recommend using only hardware edge smoothing for previews and bumping up software edge smoothing for final views.
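The cost factor described above (rendering each frame several times) can be illustrated with a toy supersampling loop. This is a generic sketch of the technique, not Chief Architect's actual renderer; `render_pixel` is a hypothetical stand-in for a real shading function:

```python
# Toy illustration of software edge smoothing (supersampling):
# each output pixel averages an N x N grid of sub-samples, so the
# shading work grows by a factor of N*N -- hence the 2x-15x slowdown.

def render_pixel(x, y):
    # Hypothetical stand-in for real shading: a hard diagonal edge.
    return 1.0 if x > y else 0.0

def render(width, height, samples_per_axis=1):
    step = 1.0 / samples_per_axis
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    total += render_pixel(px + sx * step, py + sy * step)
            row.append(total / samples_per_axis ** 2)
        image.append(row)
    return image

# With 1 sample the edge pixels are hard 0/1 ("jaggy"); with 2 samples
# per axis (4x the work) edge pixels take intermediate values.
```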


Hi All,

 

I've worked out a workaround: I put a design together, and when I want to view it with the best 3D graphics I use Home Designer 2014. I will do this until I upgrade to X6.

Thanks to all for your input on this issue

 

By the way, I did have the AMD 9900 card until Dell changed it to the Nvidia, and it was perfect with the AMD.

 

X6 here I come - sooner rather than later.

Regards

Mike Summers


The newer programs automatically use multiple CPUs (cores), but with X1 you had to check a box for that in the Preferences > General area. Is it on? You can also set the Nvidia driver to use threaded optimisations for X1.exe if you haven't.
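A quick way to see how many cores the OS exposes, and what spreading CPU-bound work across them looks like, is a generic sketch like this (`busy_work` is a hypothetical stand-in; this is unrelated to X1's own internals):

```python
import os
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # Hypothetical CPU-bound stand-in for a chunk of render work.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    cores = os.cpu_count()  # how many logical CPUs the option could use
    print(f"logical CPUs: {cores}")
    # One worker process per core, each handed an equal chunk of work.
    with ProcessPoolExecutor(max_workers=cores) as pool:
        results = list(pool.map(busy_work, [100_000] * cores))
    print(f"completed {len(results)} tasks across {cores} workers")
```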


Sorry, I thought X1 was where multiple CPUs were first introduced to CA (X2 perhaps?), but at the time not everyone had dual-core CPUs, which are pretty much standard now, if not quad-core.

 

Check the reference manual/online help for "Optimize for Multi-Core CPU's"; it was a troubleshooting option.


Check to see if other programs list the Intel GPU or if it's just X1. You may need to get into the BIOS of the laptop at startup; the BIOS may also be telling the laptop to use the Intel GPU instead of the discrete GPU altogether.

 

Also check in the BIOS whether "Optimus" is enabled (that may not be the exact name). This is a feature of Nvidia GPUs that defaults to the onboard GPU instead of the discrete GPU when it isn't needed; it is meant to save power. I always disable it because it never fails to get in the way. You may be able to disable it in the Nvidia Control Panel as well. The other option is to go into Device Manager and look under Display Adapters: if the Optimus feature is enabled you will see two cards listed, the Intel/default and the Nvidia card. After you check the BIOS, you can right-click the Intel adapter and disable it. It's possible an older program may not know how to handle that feature.
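The two-adapter symptom described above can be checked with a simple heuristic. This is a hypothetical sketch that applies the same logic to a list of adapter names; on Windows you could obtain such a list from Device Manager or `wmic path win32_VideoController get name`:

```python
def optimus_likely(adapter_names):
    """Heuristic from the post above: if both an Intel integrated
    adapter and an Nvidia discrete adapter are listed, switchable
    graphics (Optimus) is probably active."""
    names = [n.lower() for n in adapter_names]
    has_intel = any("intel" in n for n in names)
    has_nvidia = any("nvidia" in n for n in names)
    return has_intel and has_nvidia

print(optimus_likely(["Intel(R) HD Graphics", "NVIDIA GeForce GTX 580M"]))  # True
print(optimus_likely(["NVIDIA GeForce GTX 580M"]))  # False
```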

 

Finally, I have had real problems actually removing AMD drivers from workstations/laptops over the last 3-4 years.

Try using Driver Fusion (after v1.2 you may have to pay, but it used to be free) IN SAFE MODE to search for and remove AMD display drivers.

Then, still in safe mode, find any AMD/ATI folders in C:\, \Program Files, \Program Files (x86), \ProgramData, and %TEMP% (type that into the Windows address bar), and delete them. Check first whether other drivers are installed and using those folders before proceeding.

I would also install Piriform CCleaner, have it check for registry problems, and fix them.
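The manual folder hunt above can be scripted as a dry-run scan that only lists candidates and deletes nothing. A hypothetical sketch; as the post warns, verify each hit before actually removing anything:

```python
import os

def find_vendor_folders(roots, keywords=("amd", "ati")):
    """Dry run: list directories under the given roots whose names
    contain a vendor keyword. Nothing is deleted."""
    hits = []
    for root in roots:
        if not os.path.isdir(root):
            continue  # skip roots that don't exist on this machine
        for name in os.listdir(root):
            path = os.path.join(root, name)
            if os.path.isdir(path) and any(k in name.lower() for k in keywords):
                hits.append(path)
    return hits

# Roots mentioned in the post (Windows paths; adjust per machine):
roots = [r"C:\Program Files", r"C:\Program Files (x86)",
         r"C:\ProgramData", os.environ.get("TEMP", "")]
for hit in find_vendor_folders(roots):
    print(hit)
```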

 

I second the others' thoughts on the driver version. I usually have to find the "just right" version, which is not always the latest.


Hi Mike, I needed to install some older software earlier this week, and to do that I had to install Win7 XP Mode first. It works great, and it occurred to me that since you are on Win7 you can probably do the same and install X1 in XP Mode rather than in Win7 as originally intended.

 

The only thing I am not sure about is which video enhancements/3D features are supported in XP Mode. Others may chime in, hopefully...

 

Get the file with -N in the name unless you need/want Windows Media Player in XP.

 

I was linked to this page to do it; hope it helps...

 

 

http://windows.microsoft.com/en-us/windows7/install-and-use-windows-xp-mode-in-windows-7#section_3

