TheKitchenAbode

Everything posted by TheKitchenAbode

  1. Editing a plan does not affect the Raytrace currently being run. If you wish those edits to show, you will need to start a new Raytrace.
  2. I realize there are other ways to accomplish this. The method I used seemed the most logical: create an opening in the floor, place a moulding polyline at the opening's edge, and then place railings around the opening. Everything would flow off of the floor hole polyline. Obviously, as I found out, the hole polyline represents the rough opening. No problem; if I wish to continue doing it that way I can always generate a CAD polyline set inside the hole to represent the drywall finish and then snap everything to it.
  3. I used the hole in floor because I was creating a large open wrap-around balcony and did not want a bunch of invisible walls surrounding the opening. Besides placing railings around the opening, I was also placing a moulding polyline to apply a bullnose overhang to the hardwood flooring. That's when I noticed it: I drew the moulding polyline snapped to the hole in floor polyline and the moulding did not locate properly.
  4. Exactly. If I'm attempting to use the hole in floor polyline to align items to the finished opening, I have to add an offset equal to the vertical drywall finish. Just wondering if there might be a setting somewhere so that when I draw the hole in floor polyline it is actually the size of the finished opening.
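The offset described above is simple to compute by hand. Here is a minimal sketch of the arithmetic, assuming a hypothetical 1/2" vertical drywall finish on every side of the opening (your plan's actual finish thickness may differ):

```python
# Hypothetical sketch: deriving finished-opening dimensions from the
# rough-opening (hole in floor) polyline by insetting the vertical
# drywall thickness. The 1/2" value below is an assumption, not a
# Chief Architect default.

DRYWALL_THICKNESS_IN = 0.5  # vertical drywall finish, inches (assumed)

def finished_opening(rough_width_in, rough_depth_in, finish=DRYWALL_THICKNESS_IN):
    """Inset the rough opening by the finish thickness on every side."""
    return (rough_width_in - 2 * finish, rough_depth_in - 2 * finish)

# A 48" x 96" rough opening with 1/2" drywall all around:
print(finished_opening(48.0, 96.0))  # (47.0, 95.0)
```

So any item snapped to the rough-opening polyline needs to be nudged inward by one finish thickness per side to sit flush with the finished opening.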
  5. If I look at your cross section view it seems that your dimension is to the framing face not the exposed drywall finish.
  6. I did that initially, as I thought that was where the problem was, but the real issue is that the floor plan polyline for the hole in floor is not sized to the finished opening; it seems to be sized to the rough opening.
  7. Surfaces, but it is not related to that, as I always manually place my dimensions. The issue is that the hole in floor polyline is actually the rough opening, not the finished opening.
  8. Just noticed this and checking to see if others have experienced it. Placed a hole in floor on a second floor. Dimensioned it in plan view. Then took a cross section and dimensioned it. The plan view polyline dimension is off by the thickness of the opening's vertical drywall finish. Encountered this as I was attempting to align some railings in plan view to be perfectly flush with the opening by snapping to the floor opening polyline.
  9. The primary hardware involved in 3D views is the CPU and video card, and unfortunately neither of these can be upgraded in your HP All-in-One. I think, as has been suggested, your best bet is to save up some funds and purchase a better overall system. To keep costs down you could consider a refurbished or open-box system, or possibly a used gaming desktop such as a prior-generation Alienware Aurora.
  10. Are you saying that this only happens with a new custom material? In other words if you use a standard CA material then it shows correctly.
  11. Unfortunately Chief does not save any data with the pic; it would be great if it did. How a particular pic is attained depends on a number of variables: sun settings, light source settings, material properties and the available settings in the Raytrace DBX. As you can deduce, this amounts to a lot of variables that can impact your pic's quality and the look you wish to achieve. Sorting through this can be daunting, and I would suggest you search this forum, as there are numerous discussions on it and many users have attempted to provide some guidance on their approach. A great way to obtain some specific recommendations is to post an example of your output.
  12. If you are running CA X12 you have the option to Pause your Raytrace and then Resume it; there is a pause and play icon. To fully terminate it, you close the Raytrace window and a pop-up will ask for confirmation. You can save your scene at any time, even while it is Raytracing. Concerning when it is done: there is no defined "done"; it's up to you to decide if and when the scene looks right. However, it's important to understand that just running hundreds or thousands of passes does not guarantee a high-quality pic. The quality of the pic is primarily determined by your lighting and material property settings and the Raytrace DBX settings. Personally, if after say 30 passes your scene is not looking all that good, I would suggest revisiting those settings.
  13. That's the benefit of the user library: we can structure it and use it according to our personal needs; there is no right or wrong way.
  14. I guess you could do so, but it seems a bit redundant, as within each plan you can open up the material DBX and see this information, at least for the materials. Personally I only use the user library for my most frequently used items, or as a temporary type of cache when I want to use the replace from library tool or need to convert an imperial object from a former plan to metric in another. The other issue, of course, is that over time your user library is going to become enormous with all those folders and sub-folders.
  15. Not sure why I would be suggesting doing something improperly. Ok, so we at least have some degree of agreement that overclocking, providing it is done properly, is valid and it will not be detrimental to the CPU. I'm taking a bit of a leap here but I think we can also say that overclocking, when done properly, will also result in a significant improvement in system performance. Maybe we could also go so far as to say that boost/turbo technology also has the potential to improve system performance when cooling is done properly. How about we now open up a discussion on hyperthreading?
  16. It's certainly not my intention to oversimplify this subject, and I have no issue delving into it in much more detail. Of the responses I have received, the most controversial appears to be overclocking. I understand how the term can conjure up images of a computer consumed in smoke and flames, but it's important to realize that those horror stories involve users chasing world records by pushing the processor and cooling system beyond their design parameters; that is not what I'm suggesting or promoting. Let's take a slightly more in-depth look. This discussion was originally initiated by a difference of opinion over base versus boost frequency and its relevance to CPU performance. I think we can all agree that a CPU's greatest threat is heat: as frequency increases, so does the heat evolved. This poses a challenging trade-off, because frequency determines throughput; the higher the frequency, the higher the throughput, and the faster an instruction is processed. To address this heat-limiting dilemma, processor manufacturers have developed a number of strategies. One is so-called boost technology, which exploits the fact that as frequency increases it takes a certain amount of time for heat to build up; this creates a window of opportunity in which throughput can be substantially increased. Built-in temperature sensors control this: if temperatures are below a certain level, boosting can take place; when they reach a specified maximum, boosting is disengaged. Yes, this means the boost is time-limited, and depending on the processor it may last as little as 10 seconds, or possibly 60.
Though that seems like a very short period, it's important to think about time from a computer's perspective: in 10 seconds at, say, 3 GHz the processor will cycle 30 billion times; at 5 GHz it would cycle 50 billion times. That is a significant improvement in throughput. From the discussion so far it should be obvious that what determines the boost's duration is heat, so if the evolved heat could be dissipated at a greater rate, the boost could be sustained for a longer period; and if the heat could be dissipated faster than it is evolved, the boost could be maintained indefinitely. This is the fundamental basis for additional cooling, whether air or liquid: as long as the CPU's temperature can be held below a certain threshold, the frequency can be increased up to the point where the heat removed equals the heat evolved. Keep in mind that processor manufacturers tend to leave cooling solutions to third parties, and when they do include one it is typically just a basic air cooler. This is likely because they do not control the end use of the chip; that is determined by the system builder. As such, the chip manufacturer tests and publishes base and boost frequencies that can be relied upon and guaranteed; in essence they represent the minimum design performance. Within that specification there is also data indicating the potential maximum design performance, derived from the maximum design operating temperature and the maximum design frequency. Taking Dell's Alienware as an example, Dell designs its own cooling solutions and provides its own software to control them. This included software is what is commonly considered overclocking software, as it allows both CPU and cooling parameters to be adjusted to maximize CPU throughput.
Though a user can go in and tweak the settings, Dell provides factory settings specifically chosen to maximize throughput while respecting the CPU's maximum design specs, so using them will not undermine the CPU's longevity. The advantage of this type of overclocking is that it allows the processor to run at a sustainable speed higher than its base clock, typically just a bit below the maximum boost speed. The key is that this new speed is sustainable; it does not cycle up and down like a boost. Essentially it becomes the CPU's new base clock. The purpose of overclocking is to raise the base clock speed, not the boost speed. So in conclusion, yes, the base clock speed is important, but in a roundabout way. Ideally you want the CPU to run at as high a base speed as possible; its ability to do so is determined by the efficacy of the cooling solution. The boost provides a partial answer, and it also demonstrates what the base clock could be with the right cooling solution. Though I have focused on frequency, I am fully aware there are other performance-related factors such as core count, hyperthreading, cache size and more. These all come into play and are certainly worthy of discussion, but best left for another day, as they need to be discussed individually to gain a perspective on how they all work together. Cheers
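The cycle counts quoted above are just frequency times duration. A quick sketch of that back-of-envelope arithmetic (the frequencies and the 10-second window are the illustrative figures from the post, not measured values):

```python
# Back-of-envelope throughput numbers from the boost discussion:
# total clock cycles completed over a boost window, and the gain
# from sustaining a higher clock. Figures are illustrative only.

def cycles(freq_ghz, seconds):
    """Total clock cycles: frequency (GHz converted to Hz) times duration."""
    return freq_ghz * 1e9 * seconds

base = cycles(3.0, 10)   # 3 GHz sustained for 10 s
boost = cycles(5.0, 10)  # 5 GHz boosted for 10 s

print(f"{base:,.0f} cycles at 3 GHz")   # 30,000,000,000 cycles at 3 GHz
print(f"{boost:,.0f} cycles at 5 GHz")  # 50,000,000,000 cycles at 5 GHz
print(f"throughput gain: {boost / base - 1:.0%}")  # throughput gain: 67%
```

The same ratio holds however long the window is, which is why sustaining the higher clock (rather than boosting briefly) is the whole point of the cooling discussion.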
  17. Well I'm honestly at a loss for words. Not sure what your source of technical information is but it's definitely not the same as mine. Cheers!!!
  18. You really should not see it that way. This is an intended design feature, every processor manufacturer uses it, it's how they are designed to work. It's the boost that's giving you a great experience not the 3.7 GHz.
  19. Agree that the Intel i9-10900KF is a great processor. The question is: is it the base clock speed or its turbo speed capability that makes it great? Without its ability to boost to 5.3 GHz, this processor would rank much lower. I understand that you are averse to the terms Boost & Turbo, but these technologies are critical to maximizing a CPU's throughput. They are stated separately not as marketing fluff but to define the difference between the frequency the CPU is designed to run continuously across all cores and the frequency it can run for shorter durations. If you scroll down that Dell spec/option sheet you will also notice that choosing that processor requires their liquid cooling option, and if you are familiar with these systems, Dell provides overclocking software; just open it up and turn it on, no need to play with voltages. Now that puppy will likely run near its peak all the time, across all cores.
  20. Apologies for belaboring the point, but based on this one could conclude that a Core i5-10600K with a base clock of 4.1 GHz is a better choice than, say, a Ryzen 9 5950X with a base clock of only 3.4 GHz. Even an i3-8350K with a base clock of 4.0 GHz beats the Ryzen. I understand that this is not what you intend to state, but the way it's written, that is what you are stating. Even if you do not purposely overclock, the processor is still being overclocked; that's what the boost is, and it is crucial to a CPU's performance. That Ryzen can boost to 4.9 GHz, which means it can process roughly 44% faster (4.9 / 3.4) when boosted than when running at its 3.4 GHz base clock. With a decent cooling system you can clock it to run at its boost speed all of the time.
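A quick way to sanity-check boost-versus-base claims like the ones above is the simple frequency ratio (real workload speed-ups will vary with memory, cache and core count, so treat this as an upper-bound sketch):

```python
# Checking the boost-vs-base arithmetic for the CPUs discussed above.
# Speed-up here is the bare frequency ratio; actual workload gains
# will differ, so this is an illustrative upper bound only.

def boost_speedup(base_ghz, boost_ghz):
    """Fractional throughput gain of the boost clock over the base clock."""
    return boost_ghz / base_ghz - 1

# Ryzen 9 5950X: 3.4 GHz base, 4.9 GHz boost
print(f"Ryzen 9 5950X: {boost_speedup(3.4, 4.9):.0%}")  # Ryzen 9 5950X: 44%

# i9-10900KF: 3.7 GHz base, 5.3 GHz boost
print(f"i9-10900KF: {boost_speedup(3.7, 5.3):.0%}")  # i9-10900KF: 43%
```

Both chips gain over 40% from their boost clocks, which is the point being made: comparing base clocks alone badly misrepresents relative performance.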
  21. Just interested in the reasoning behind some of these recommendations. Maybe you are discussing CA in combination with a third-party renderer; if so, it would be best to separate these out so readers can differentiate between the two.
Boost Speed - This is an important specification, as it tells you the CPU's maximum designed frequency. A CPU with a low boost speed will be slower on average than one with a higher boost. Yes, base clock is important, but it should not be taken on its own. If you overclock, the boost speed is very important, as it is a strong indicator of how fast you will be able to get the CPU to run on a continuous basis.
SSD Drive - The primary drive should be an NVMe type, roughly 10x faster than a generic SATA SSD.
Minimum 64 GB RAM - Though extra RAM will do no harm, there is no real benefit to having significantly more than needed. In the context of CA, 16 GB is more than adequate for the majority of users; you really only need more if you are running some other software that can benefit from it.
Concerning the AMD Threadripper: the only real function in CA that can take advantage of all those cores is the CPU Raytracer, so if that is not an important part of one's workflow, most of those cores will be of no benefit.
I believe it is important when making these recommendations to maintain a high level of context, to avoid leaving a reader with the impression that CA will only function on very highly spec'd systems. That is misleading and could inadvertently deter a potential purchaser of CA if they are under the impression that they will need to spend $4,000 or $6,000 plus the cost of CA. Many users have no need or desire to generate photo-realistic renderings, and their system configuration will be significantly different from those who do.
  22. Just my opinion but I still think it would be prudent to wait until you have actually upgraded to X13 and had an opportunity to evaluate how it performs on your existing system. From what I can see/understand the only significant consideration relates to whether or not you feel a desire to Raytrace in Real Time. If so then you will definitely need a reasonably powerful RTX 3000 series graphics card. All I can say is that from what I have seen so far I'm not convinced that CA's Real Time Raytracing in itself is worth spending that kind of money on.
  23. This style has become very popular in certain regions, especially in the area surrounding Toronto. Here is one we worked on several years ago, located in Oakville: Project 13-441 | BONE Structure. This company is based out of Quebec. Some good examples of finishes. The exterior finishes are really straightforward, typically some form of printed metal siding or panel mixed with stucco and/or ledger-type stone.