Computer spec thoughts with X13 in mind (planning new build)


VisualDandD

Hey all.  Just wanted to start a thread where people could share some ideas on new box builds for the coming version.


It has been about 5 years since I built my last box.  Still hanging in there great: i7-4790K w/32GB, GTX 970.  Funny, it will still run most of my son's games at 100 fps even on high detail settings.

BUT with the announcement that Chief will be in for a significant upgrade, I wanted to start planning a new build.

 

I am guessing the RTX 3080 may be the 'go-to' card, but it sure comes with a price!


Love to hear ideas of what you guys might go with. 

 

 


Just my opinion, but I still think it would be prudent to wait until you have actually upgraded to X13 and had an opportunity to evaluate how it performs on your existing system. From what I can see/understand, the only significant consideration is whether or not you feel a desire to ray trace in real time. If so, you will definitely need a reasonably powerful RTX 3000 series graphics card. All I can say is that from what I have seen so far, I'm not convinced that CA's real-time ray tracing in itself is worth spending that kind of money on.


5 hours ago, TheKitchenAbode said:

Just my opinion, but I still think it would be prudent to wait until you have actually upgraded to X13 and had an opportunity to evaluate how it performs on your existing system. ...

 

Agreed!


16 hours ago, mhdjawwad said:

If you really want a new build, focus on this:

1) A better CPU (high base clock speed, NOT boost speed). Intel i9 10xxx and AMD 59xx are good choices. Wanna go real crazy? Go for the AMD Threadripper.

2) A minimum 1 TB SSD drive.

3) A minimum of 64 GB of 3400 MHz RAM. I recommend 64 GB because RAM is dirt cheap these days.

4) A low-end RTX 2060 now or an RTX 3060 (early next year) will do. I only recommend these because the cards are cheap.

 

Just interested in the reasoning behind some of these recommendations. Maybe you are discussing CA in combination with a third-party renderer; if so, it would be best to separate these out so readers can differentiate between the two.

 

Boost Speed - this is an important specification, as it tells you the CPU's maximum designed frequency. A CPU with a low boost speed will be slower on average than one with a higher boost. Yes, base is important, but it should not be taken on its own. If you overclock, the boost speed is very important, as it is a strong indicator of how fast you will be able to get the CPU to run on a continuous basis.

 

SSD Drive - the primary drive should be an NVMe type; roughly 5-10X faster in sequential throughput than a SATA SSD.
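For anyone wanting to sanity-check a drive before and after an upgrade, a crude sequential-write timing takes only a few lines. This is a rough sketch only (the file name is arbitrary, and a real benchmark tool such as CrystalDiskMark or fio is far more rigorous):

```python
# Crude sequential-write speed test. Writes `size_mb` of random data,
# fsyncs so the OS cache doesn't hide the real drive speed, then
# reports MB/s. Illustration only -- not a substitute for a real
# benchmark tool.
import os
import time

def write_speed_mb_s(path: str, size_mb: int = 256) -> float:
    block = os.urandom(1024 * 1024)  # 1 MB of incompressible data
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force the data to actually hit the drive
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

# Point it at a file on each drive to compare, e.g.:
# print(write_speed_mb_s("D:/bench.tmp"))
```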

 

Minimum 64 GB RAM - though extra RAM will do no harm, there is no real benefit to having significantly more than needed. In the context of CA, for the majority of users 16 GB is more than adequate; you really only need more if you are running other software that can benefit from it.

 

Concerning the AMD Threadripper: the only real function in CA that can take advantage of all those cores is the CPU ray tracer. If that is not an important part of one's workflow, then most of those cores will be of no benefit.

 

I believe it is important when making these recommendations to maintain context, to avoid leaving a reader with the impression that CA will only function on very highly spec'd systems. That is misleading and could inadvertently deter a potential purchaser of CA who is under the impression that they will need to spend $4,000 to $6,000 plus the cost of CA. Many users have no need or desire to generate photorealistic renderings, and their system configuration will be significantly different from those who do.


1 hour ago, mhdjawwad said:

A higher base clock speed simply means you get to do more without overclocking, with less noise/heat.

 

Apologies for belaboring the point, but based upon this one could conclude that a Core i5-10600K with a base clock of 4.1 GHz is a better choice than, say, a Ryzen 9 5950X with a base clock of only 3.4 GHz. Even an i3-8350K with a base clock of 4.0 GHz beats the Ryzen. I understand that this is not what you intend to state, but the way it's written, that is what you are stating. Even if you do not purposely overclock, the processor is still being boosted; that's what boost is, and it is crucial to a CPU's performance. That Ryzen can boost to 4.9 GHz, which means it can process about 44% faster when boosted versus running at its 3.4 GHz base clock. With a decent cooling system you can get it to run at its boost speed all of the time.
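That frequency ratio is easy to verify (a hypothetical helper; it assumes throughput scales linearly with clock speed, which ignores IPC, memory, and thermal effects):

```python
# Fractional speedup of a boosted clock over the base clock,
# assuming work done scales linearly with frequency (an idealization).

def boost_speedup(base_ghz: float, boost_ghz: float) -> float:
    """How much faster the boost clock is, as a fraction of base."""
    return boost_ghz / base_ghz - 1.0

# Ryzen 9 5950X figures from above: 3.4 GHz base, 4.9 GHz boost.
print(f"{boost_speedup(3.4, 4.9):.0%}")  # -> 44%
```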

 

 


14 hours ago, mhdjawwad said:

Here is a simple breakdown for what I mean by a 'better CPU and higher base clock speed':

 

[screenshot: Dell processor option list]

 

Same manufacturer of CPU (Intel). Just pick the 600 one (last one of the second row). Nice base clock speed - not going to break the bank either.

 

 

 

Agree that the Intel i9-10900KF is a great processor. The question is whether it is the base clock speed or its turbo capability that makes it great. Without its ability to boost to 5.3 GHz, this processor would rank much lower. I understand that you are averse to the terms Boost & Turbo, but these technologies are critical in maximizing a CPU's throughput. They are stated separately not as marketing fluff but to define the difference between the frequency the CPU is designed to run continuously across all cores and the frequency it can run for shorter durations. If you scroll down that Dell spec/option sheet you will also notice that choosing that processor requires their liquid cooling option, and if you are familiar with these systems, Dell provides overclocking software: just open it up and turn it on, no need to play with voltages. Now that puppy will likely run near its peak all the time across all cores.


1 minute ago, mhdjawwad said:

It's icing on the cake. It's like the year-end bonus that employees get to trick them into thinking they are really contributing to the company.

 

I would like a CPU to do more without pushing itself. That 3.7 GHz will do a lot more out of the box - I know I have that speed available even for the most mundane tasks. I will take the extra boost icing; I don't know when I will get it, but I will take it anyway.

 

You really should not see it that way. This is an intended design feature; every processor manufacturer uses it; it's how they are designed to work. It's the boost that's giving you a great experience, not the 3.7 GHz.


5 minutes ago, mhdjawwad said:

In this case it's more than the 3.7 GHz that attracts me. The cores and the cache - those are a big deal. If you have one of these CPUs, I invite you to disable Turbo Boost and see how well your system will respond.

 

Turbo Boost on an already hot system is of little value; I doubt it will kick in and last long enough to be of real benefit. More than likely you will see a massive improvement even for an application like CA. Even for games, I recommend disabling Turbo Boost; it will improve performance by leaps and bounds.

 

Liquid cooling is great, but it is not meant to provide Turbo Boost for longer periods or to allow it. It's meant to do the opposite: to prevent the CPU from going into Turbo Boost in the first place, regardless of what the manufacturer is saying. They advertise in that manner because people are programmed to believe a boost will be better for their machines. 'Our CPUs can run cooler!' - this is what they should advertise. The CPU market is heavily influenced by the gaming industry, so the boost keyword is used repeatedly. It's marketing at its ultimate best.

 

That 3.7 GHz, the cores, and the cache are better than all the boost they are advertising.

 

Well I'm honestly at a loss for words. Not sure what your source of technical information is but it's definitely not the same as mine.

 

Cheers!!!


I was in need of a new desktop computer, and I'm at least glad I knew about the changes coming to CA.  My existing machine is pushing 7 years, so I'm OK investing a little to take advantage of the software advances.

 

I just purchased a MSI with the following specs:

Intel Core i7-10700KF - 8 Core (3.8 GHz - turbo 5.1 GHz)

16GB DDR4 Memory (upgradeable to 128GB)

NVIDIA GeForce RTX 3080

1TB SSD

MSI Z490 Motherboard

Liquid cooled

 

Not the top end, but I think it will work well, with the option to add more RAM.  Will be setting it up this week, so I'll know more later.

 

BT


10 hours ago, TheKitchenAbode said:

Well I'm honestly at a loss for words. Not sure what your source of technical information is but it's definitely not the same as mine.

 

It's similar to mine.  I feel like you are simultaneously oversimplifying the concept of boost speeds and overestimating the number of users who will (or even should) be overclocking, and that you're kind of intermingling the two when you should probably focus on one or the other.

 

First off, I'd venture to guess that less than 1% of Chief users truly overclock their systems and even those that do are more than likely reducing the lifespan of their processor, so that concept should hardly be worth even mentioning. 

 

More specifically with regard to OOB boost speeds though, it should be stressed and stressed heavily that those speeds are only ever attained under just the right circumstances (when there's enough available power, when the temperature is low enough, and when the system feels the "need" is great enough), for short time periods, and for all intents and purposes only truly apply to a single core.  A dusty air cooled machine in a warm room may never see the computer boost at all and even a well maintained liquid cooled machine may see very little benefit during a ray trace that's set to run for several hours on all cores.  Plus, I'd argue that just because there isn't a load on the system doesn't mean processes wouldn't benefit from running at a higher base clock speed...a speed that can be maintained by design and OOB 100% of the time as opposed to the aforementioned boost speeds that are very situationally dependent.  There are obviously many other factors to consider as well but the boost speed should be considered with a grain of salt for sure...kinda like comparing an engine with nitrous to one without.  Yes the nitrous will provide more horsepower at the top end, but when, for how long, and at what cost?
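Those gating conditions can be sketched as a simple predicate; the thresholds below are invented purely for illustration and do not reflect any real chip's firmware:

```python
# Boost is only granted when every gating condition holds at once,
# mirroring the "just the right circumstances" idea above.
# All thresholds are made-up illustrative values.

TEMP_LIMIT_C = 80.0
POWER_HEADROOM_W = 25.0

def boost_allowed(temp_c: float, headroom_w: float, demanding_load: bool) -> bool:
    return (temp_c < TEMP_LIMIT_C              # cool enough
            and headroom_w > POWER_HEADROOM_W  # enough power budget
            and demanding_load)                # scheduler sees the "need"

print(boost_allowed(55.0, 40.0, demanding_load=True))   # -> True
print(boost_allowed(85.0, 40.0, demanding_load=True))   # -> False (too hot)
print(boost_allowed(55.0, 40.0, demanding_load=False))  # -> False (no demand)
```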

 

 


For folks not following the theoretical discussion, here are visuals of real-life readings using Chief with a 2.3 GHz (max turbo frequency 5.1 GHz) processor.

Simplified - look at the right-hand column.

Utilization - shows how much each thread on the CPU is used.

Clocks - shows the speed each core reached during the processes performed.

Middle column - when it shows 0%, it means I reset the view to show ONLY what was used in the described process; the plan file was already open.

All settings are OOB other than the Windows power profile set to Performance with the processor set to 100% min and max when plugged in. Only Chief is open on the computer. The plan for the first 3 is just the shell and interior, only a 7 MB file.

 

[screenshots: three captures of per-core utilization and clock readings]

The next two are working with an 18MB file, complete structure.

[screenshot: per-core readings while working with the 18 MB file]

The last one is running an OOB interior ray trace for 10 or 11 passes. The processor is running at the minimum MHz and all cores are pegged at 100% the entire time.

[screenshot: readings during the CPU-based ray trace]

What I get from this, using OOB settings on my machine: A) Chief manages to use most cores and threads often; B) during general work in Chief, the processor runs most of the time above the minimum clock speed but rarely reaches full turbo boost; C) the only time the processor runs at minimum speed is during CPU-based ray tracing, when all cores run at 100% until done (on this processor anyway).

 

On 12/24/2020 at 5:37 PM, VisualDandD said:

Love to hear ideas of what you guys might go with. 

I think Rich hit it right off. It's great that Chief has come forward with as much advice here as they have, AND offers to let us call and check with them about what we are considering.

I agree it's worth waiting to see what X13 brings, but my hunch from what I have seen is that it may actually perform in some ways better than X12. FWIW, I talked with a few developers at IBS last year and again over the summer before I got this laptop. Both times I was told to get an RTX card. Also, they had bought RTX 2070 cards for the developers on the rendering team, which lets you know what they were working with most of the time until this fall.

 

If I were getting a desktop to use for the next 5 yrs (which this may be), I would wait to see where things fall after the 11th-generation Intel CPUs are out. Then both AMD and Intel will have PCIe 4, which may not be big now but matters in the long run IMO... and we will have seen both the new AMD and NVIDIA cards in use for a bit. Then I'd decide.

...for today: top-end i7; 32 GB RAM clocked to match the processor; RTX 3070; the fastest NVMe I can afford; a MOBO w/ PCIe 4, USB-C 3.2 & Thunderbolt 4.


5 hours ago, Alaskan_Son said:

 

It's similar to mine.  I feel like you are simultaneously oversimplifying the concept of boost speeds and overestimating the number of users who will (or even should) be overclocking. ...

 

 

 

It's certainly not my intention to oversimplify this subject, and I have no issue delving into it in much more detail. From the responses I have been receiving, the most controversial point appears to be overclocking. I understand how this term can conjure up images of a computer being consumed in smoke and flames. It's important to realize that those horror stories relate to users attempting to set a world record by overclocking beyond the processor's and cooling system's design parameters; that is not what I'm suggesting or promoting.

 

Let's take a slightly more in-depth look. This discussion was originally initiated by a difference of opinion over base and boost frequency and their relevance to CPU performance. I think we will all agree that a CPU's greatest threat is heat: as frequency increases, so does the heat evolved. This poses a challenging issue, because frequency determines throughput; the higher the frequency, the higher the throughput, and the faster an instruction is processed. To address this heat-limiting dilemma, processor manufacturers have developed a number of strategies. One of these is so-called boost technology. It relies on the fact that as frequency increases it takes a certain amount of time for heat to build up, creating a window of opportunity in which throughput can be substantially increased. To control this, built-in temperature sensors are used: if temps are below a certain level, boosting can take place; when temps reach a specified maximum, boosting is disengaged. Yes, this means the boost has a time limitation, and depending upon the processor it may only be for as little as 10 seconds, or possibly 60 seconds. Though this seems like a very short period, it's important to think about time from a computer's perspective: in 10 seconds at, say, 3 GHz the processor will cycle 30 billion times; at 5 GHz it would cycle 50 billion times. That is a significant improvement in throughput.
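The cycle counts quoted above follow directly from frequency times time (a trivial illustration):

```python
# Clock cycles elapsed over a time window: frequency (Hz) x seconds.
# Shows why even a 10-second boost window represents a huge amount of work.

def cycles(freq_ghz: float, seconds: float) -> float:
    return freq_ghz * 1e9 * seconds

print(f"{cycles(3.0, 10):.0e}")  # 3 GHz for 10 s -> 3e+10 (30 billion)
print(f"{cycles(5.0, 10):.0e}")  # 5 GHz for 10 s -> 5e+10 (50 billion)
```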

 

From the discussion so far it should be obvious that what determines the boost's duration is heat. If the evolved heat could be dissipated at a greater rate, the boost could be sustained longer; and if the heat could be dissipated at a rate greater than the heat evolved, the boost could be maintained indefinitely. This is the fundamental basis for additional cooling, either air or liquid. As long as the CPU's temperature can be held below a certain threshold, the frequency can be increased up to the point where the heat removed equals the heat evolved.
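That heat-in versus heat-out balance can be demonstrated with a toy simulation. Every constant here is invented for illustration; real boost algorithms track power, current, and temperature in far more detail:

```python
# Toy thermal model: heat input depends on whether the CPU is boosting,
# cooling removes heat in proportion to how far the CPU is above ambient,
# and boost is gated off once a temperature limit is reached.
# All numbers are made up for illustration.

T_LIMIT = 90.0    # deg C: boost disengages at or above this
T_AMBIENT = 30.0  # deg C

def boosted_fraction(cooling_rate: float, steps: int = 600) -> float:
    """Fraction of `steps` one-second ticks spent boosting."""
    temp, boosted = T_AMBIENT, 0
    for _ in range(steps):
        boosting = temp < T_LIMIT
        heat_in = 2.0 if boosting else 1.0  # boost makes more heat
        temp += heat_in - cooling_rate * (temp - T_AMBIENT) / 60.0
        boosted += boosting
    return boosted / steps

print(boosted_fraction(cooling_rate=1.0))  # weak cooler: boosts only briefly
print(boosted_fraction(cooling_rate=3.0))  # strong cooler: boost sustained (1.0)
```

With the stronger cooler, heat is removed faster than the boosted clock generates it, so the temperature never reaches the limit and the boost runs indefinitely, which is exactly the argument above.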

 

It's now important to keep in mind that processor manufacturers tend to leave cooling solutions to third parties, and when they do include one, it is typically just a basic air cooler. This is likely because they do not control the end use of the chip; that is determined by the system builder. As such, the chip manufacturer tests and publishes base and boost frequencies that can be relied upon/guaranteed; in essence they represent the minimum design performance. Within this specification there is also data that indicates the potential maximum design performance, derived from the maximum design operating temperature and the maximum design frequency. If we use Dell Alienware as an example, Dell designs its own cooling solutions and provides its own software to control them. This included software is what is commonly considered overclocking software, as it provides the ability to adjust both CPU and cooling parameters to maximize CPU throughput. Though a user can go in and tweak the settings, Dell provides factory settings that maximize throughput while ensuring the maximum CPU design specs are respected, so using them will not undermine the CPU's longevity. The advantage of this type of overclocking is that it allows the processor to run at a sustainable speed higher than its base clock speed, typically just a bit below the maximum boost speed. The key here is that this new speed is sustainable; it does not cycle up and down like a boost; essentially it becomes the CPU's new base clock speed. The purpose of overclocking is to overclock the base clock speed, not the boost speed.

 

So in conclusion: yes, the base clock speed is important, but in a roundabout way. Ideally you want the CPU to run at as fast a base speed as possible; its ability to do this is determined by the efficacy of the cooling solution. The boost provides a partial solution and also demonstrates what the base clock speed could be with the right cooling.

 

Though I have focused on frequency, I am fully aware there are other performance-related factors such as core count, hyperthreading, cache size, and more. These all come into play and are certainly worthy of discussion; best left for another day, as they need to be discussed individually to gain a perspective on how they all work together.

 

Cheers


4 hours ago, TheKitchenAbode said:

The advantage of this type of overclocking is that it allows the processor to run at a sustainable speed higher than its base clock speed, typically just a bit below the maximum boost speed. The key here is that this new speed is sustainable; it does not cycle up and down like a boost; essentially it becomes the CPU's new base clock speed.

 

Yes.  If you're talking about a system properly overclocked from the factory then we're no longer comparing apples to apples.  As you've stated, you essentially just have a higher base clock speed.  For all other overclocking, Dell's own recommendations state it well...

 

"Alienware supports overclocking only when you order the system overclocked from factory.

Please refer to your invoice or go to dell.com/support to check if your system was factory overclocked

.

Important: Altering clock frequency or voltage may damage or reduce the useful life of the processor and other system components, and may reduce system stability and performance. Product warranties may not apply if the processor is operated beyond its specifications." 

LOVE the discussion.  And appreciate all the thoughts.

 

I am diving in, and a few things have changed since the last box I built.


Seems like hard drives, for speed, have gone to PCIe.

 

I have 3 SSDs: two 1 TB and one 2 TB.  I also have a 'hot swap' bay on the front which I love, and it's hard to find a new case with nice external bays that also has front-panel USB-C.


I did find one though. I run continuous software backup to a 2 TB conventional drive.  It is not true RAID but rather software-driven.

 

Looks like video cards are going to be in short supply till Feb/April.  So part of my motivation is two-fold: my 13-year-old son has been bugging me for a gaming PC, and the box I built 5 years ago is still pushing 100 FPS on most of his games.  For a 5-year-old box, the specs have aged well.

 

i7-4790K, with a mild overclock applied automatically by my MSI motherboard.  I have a good cooler.

GTX 970 card w/ 4 GB.  Amazingly, it is still on Tom's Hardware's list of best cards for 2020 (although at the lower end... but not bad for 5 years!)

24gb memory.

 

So my thoughts were to give this to my son and build a new box.  I don't see Chief's ray trace as anything that really motivates me, since when I need to render I use Lumion, or I have a full Twinmotion license.  But 90% of my "renders" done for practical purposes are either 'technical drawing' or 'watercolor' w/ line on top.  That does all I need it to do.  Any more and I'd have to spend a bunch of time 'cake decorating' a scene, which is not worth it for me.

 

I am definitely following along on this thread.  It looks like the earliest I will do a build will be March/April, and at that point I might as well wait for X13 to come out and see what everyone thinks.

 

 


7 minutes ago, Alaskan_Son said:

Yes.  If you're talking about a system properly overclocked from the factory then we're no longer comparing apples to apples.  As you've stated, you essentially just have a higher base clock speed

 

Not sure why I would be suggesting doing something improperly.

 

OK, so we at least have some degree of agreement that overclocking, provided it is done properly, is valid and will not be detrimental to the CPU. I'm taking a bit of a leap here, but I think we can also say that overclocking, when done properly, will result in a significant improvement in system performance.

 

Maybe we could also go so far as to say that boost/turbo technology also has the potential to improve system performance when cooling is done properly.

 

How about we now open up a discussion on hyperthreading?

 


5 hours ago, mhdjawwad said:

 

If you can wait, wait till the end of January, especially for a laptop. The RTX 3060 and above are coming to MSI soon. The performance has doubled, and the cost, from the looks of it, won't be blown out of proportion. The RTX 3080 and RTX 3090 have been available in Alienware for some time now. Keep in mind that the laptop versions of these cards won't be the same as the desktop ones, but they will still be much better than the current generation of RTX cards.

 

You are right: if you don't care about rendering, a low-end RTX card like the 2060 or the RTX 3060 will suffice. This will let you use CA's support for these cards while still not blowing the cost out of proportion.

 

CA's rendering is not up to the mark. There are many issues I found with it while trying to build an automated workflow from CA to actually getting a realistic render. I posted my findings earlier in the following thread in Suggestions:

 

https://chieftalk.chiefarchitect.com/topic/29031-rendering-observations-and-lighting-suggestions/

 

   I might add to that when I dig deeper into the rendering side of things.

 

 

 

 

 

 

 

Yep, I am in 'wait and see' mode.  I refuse to work off a laptop ;)  About 10 years ago, I stopped buying laptops that would run Chief.  When I leave for vacation, I have no choice but to leave work at home.


Plus, my workflow has for years required a desktop.  I run 3 monitors, a 3D mouse, a programmable game pad w/ all my hotkeys, and a 'normal' mouse and keyboard.


I can't imagine trying to work off a laptop.

 

I am going to start brushing up on the latest PC stuff.  I need to come up to speed, since it has been so long since I built one.


GREAT news is I have some stuff I can 'upcycle'.  I still have 3 or 4 gold 1000W modular power supplies from building mining rigs back in the day ;).  At least I can put one of them to use!

 

 


Been using a 3Dconnexion mouse since it first came out in 2015.


I could NEVER go back.  It has changed the way I draw, mainly by VASTLY improving 3D editing capabilities; being able to select and move the model in 3D greatly reduces redundant edits.  E.g., select a bunch of windows...

 

Here is a vid I did of how nicely a 3D mouse works.

 

 


5 minutes ago, solver said:

 

They make a bunch of mice -- which one are you using?

 

And for anyone watching, Chief has the ability to pan around in 3D while maintaining a selection.

It is the first one they released: the 3DX-700028 SpaceNavigator.  I don't think they make it any longer.  Looks like the most similar is the 'Compact' one, except the one I have is full size but is only the 3D mouse.

 

 


I don't need all the buttons, because I also run a Logitech G13 keypad alongside it.

 

I put labels on it, so I have it programmed as a 10-key with ' and " keys to make manually entering dims easy.  I also have a different mode that I use for roofs and CAD, which changes the programming of the keys to common functions like point-to-point move, connect roof planes, break CAD... etc.  Not sure they still make this one either ;) but it is part of my workflow now...

 

I also use a Logitech gaming mouse with 12 programmable keys and on-the-fly DPI switching.  Basically I plug the computer right into my brain ;)

 

 

[photo: labeled G13 keypad and mouse setup]

 

 


17 minutes ago, solver said:

They make a bunch of mice -- which one are you using?

 

And for anyone watching, Chief has the ability to pan around in 3D while maintaining a selection.

 

My wife bought me the SpaceMouse Compact for Christmas.  It's fabulous, not only for working in 3D but for working in 2D views as well.  Very easy to click and draw with my right hand while I pan and zoom with my left.  I will say this for anyone thinking of getting one... one of the very first things I thought after playing with it for a bit is that I wish the mouse had the Escape, Control, Tab, and maybe Shift keys right on it.  I was pretty unfamiliar with these mice and did a quick search on the other models.  Sure enough, the SpaceMouse Pro has most of those keys right on it, plus a few extra programmable keys.  I say all that to say this: if you're going to make the jump, the Compact version is worth the money for sure, but I would really recommend going straight for the SpaceMouse Pro.  You'll be able to keep working a lot more often without having to let go to reach for the keyboard.


1 minute ago, Alaskan_Son said:

 

My wife bought me the SpaceMouse Compact for Christmas.  It's fabulous, not only for working in 3D but for working in 2D views as well. ...

Michael


I have all those keys programmed on the side of my 'normal' mouse.  Tab is SUPER effective; you don't realize how many times you press it!  I also have the open-DBX hotkey on there.  I don't use all 12 keys; I thought about eventually adding some more key binds, but I already have the extra keyboard.


Having access to those important functions is critical when using the 3D mouse, so binding them to your 'normal' mouse works well too!

 

It also has on-the-fly DPI switching, which I use from time to time; very nice!

 

https://www.amazon.com/Logitech-Gaming-Backlit-Programmable-Buttons/dp/B0086UK7IQ/ref=sr_1_2?dchild=1&keywords=logitech+g600&qid=1609293472&s=electronics&sr=1-2


6 minutes ago, VisualDandD said:

I have all those keys programmed on the side of my 'normal' mouse.  Tab is SUPER effective; you don't realize how many times you press it! ...

 

Ya, I have some programmable buttons on my normal mouse as well.  I don't really like them all that much, though.  I would much rather have them over on my left-hand side where I'm already accustomed to hitting them.  Besides, it would also allow me to hit and/or hold those buttons while I use the number pad, arrow keys, and/or the Enter key with my right hand.


7 minutes ago, Alaskan_Son said:

 

Ya, I have some programmable buttons on my normal mouse as well.  I don't really like them all that much, though. ...

 

Gotcha!

 

On a tangent, I have played with the idea of programming one of these (link below)!

 

I think it could be a very powerful tool with child buttons... etc.  Put a 10-key in there with ft and in keys, along with all the edit functions.

 

But who am I kidding.   I keep my SSA up to date, and I have never even loaded X12 yet (srs).    Busy year was busy!

 

https://www.amazon.com/Elgato-Stream-Deck-XL-customizable/dp/B07RL8H55Z/ref=sr_1_1?dchild=1&keywords=elgato+stream+deck+xl&qid=1609294205&sr=8-1


Hey Justin. While you are mulling over all the hardware options, don't neglect to find a case that will hold it all.

 

I recently opened up a rig I was using in 2005 to run AutoCAD and SolidWorks. Check out the size of the video card back then.

[photo: 2005-era video card with its small cooling fan]

How about that cooling system - a single 1 1/4" fan.

 

Fast forward 15 years and the video card has gone from 6" x 2 1/2" x 1/2" to:

GTX 780 & 1080: 10 1/2" x 4 3/8" x 1 1/2"

RTX 3080: 11 1/4" x 4 3/8" x 1 1/2"

The RTX is another 3/4" longer than the 780 & 1080, which were already pushing the capacity of a lot of cases. If we extrapolate out another 15 years, the video card will be the size of a suitcase. :D

 

 

