SteveT

Members
  • Posts: 42

Reputation: 0 Neutral

Recent Profile Visitors: 1624 profile views
  1. Okay folks, this has been a really interesting thread and I'm grateful for the discussion my question spun off. However, the "solution" to my "problem" is that I wasn't using the product correctly: I didn't understand that the PBR-RTRT setting means you move your camera into position and then just sit there while the rendering continues in the background. I was moving my camera around, waiting a few seconds, and concluding that it looked a bit junky. By accident I walked away from my computer for a moment, and when I came back it was rendered in a way similar to CPU-based RT and looked pretty good (although annoyingly different, but that's what you'd expect from two different RT engines). I then noticed the (tiny) counter in the tray display that shows how many passes there have been (I was getting 5/sec for a pretty complicated plan; whether that's any good I don't know). Maybe CA might want to highlight that innocent little counter down there a little more in the UI, to let people new to this feature know there's something going on :-). Steve
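To make the behavior described above concrete, here is a minimal sketch (plain Swift, not Chief Architect's actual code) of how a progressive ray tracer accumulates passes: each pass adds one more sample per pixel to a running sum, the displayed image is the running average, and moving the camera resets the history, which is why a quick glance right after moving looks noisy.

```swift
// A minimal sketch, not Chief Architect's code: a progressive ray tracer keeps
// a running sum of the samples traced so far and displays the average, so the
// picture converges while the camera sits still and resets when it moves.
struct ProgressiveAccumulator {
    var sum: [Float]      // running sum of radiance per pixel
    var passes = 0        // the little pass counter shown in the tray

    init(pixelCount: Int) {
        sum = Array(repeating: 0, count: pixelCount)
    }

    // Called once per completed pass (one new sample per pixel).
    mutating func add(pass sample: [Float]) {
        for i in sum.indices { sum[i] += sample[i] }
        passes += 1
    }

    // The displayed image: the average of every pass so far.
    func resolve() -> [Float] {
        sum.map { $0 / Float(max(passes, 1)) }
    }

    // Camera moved: throw away the history and start converging again.
    mutating func reset() {
        for i in sum.indices { sum[i] = 0 }
        passes = 0
    }
}
```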
  2. I have a PC available to me to do testing with CA13, so yeah, I'm doing PBR-RTRT :-). My PC has an RTX 2070. Maybe not quite the super card you guys have, but I would think I'd get the same results you'd get, just a tad slower. But I'm getting nothing of the sort :-(. I'll take a look at Rene Rabbit's thread above. Maybe I'm doing this all wrong...
  3. I actually don't know what any of those acronyms mean. Maybe that's why your post doesn't make sense to me? When I used hardware raytracing, the results... sorta sucked. And I have a pretty darn recent card (that peaked out at $1500 during the pandemic).
  4. I'll just pile on here and concur: I was very excited about hardware raytracing until I saw that the quality is nowhere near what you get with CPU-based raytracing. If they got these on par with each other then we might have something. Until then this is a waste of time. And since I'm "not that kind of architect" (software, 30 years), I'd express a fear that the quality may never get to the level of CPU raytracing because that's not what these cards are meant to do: they are made for gaming, a realm where "almost" is perfectly fine. I could be wrong here, but I have a bad feeling about this.
  5. This seems like a simple request, so maybe I'm missing something. I want to draw a cube-shaped object but paint each surface of the cube separately instead of all six sides being the same texture. What am I missing? Steve
  6. I brought this issue up in the thread below a few months ago, and it seems like a lot of people are having the same problem: for larger plans, after a while of use, the "Building 3D Model" message comes up when you perform any UI action, even when that action shouldn't build a 3D model, such as bringing up a dialog box for an object. For a larger plan this adds hours of lag time in aggregate. For me, downgrading/upgrading didn't seem to change the problem. Since the problem is intermittent, I haven't figured out how to reliably reproduce it with a simple example and exact instructions. I am working on this but it will take some doing. Steve
  7. Thanks everyone for the replies. This has been very helpful.
  8. Yes, that's perfect. But you are right, this is probably a bad idea. I need to re-think my design. Thank you very much for your help. Steve.
  9. Hello folks, I'm working in X11, and my plan from the front is a building in three equal parts, with the left and right parts bumping out about five feet (see attached: the front is on the right side of the plan). I want the front profile to basically look like this: .^^^. looking at it from the front. In other words, I want it to be as if I had three separate buildings that each had full gable roofs, but as one building with the roofs merging into each other. Is there any way to do this with the automatic roof generator? Thanks for your help, Steve
  10. Apple's future is its M1 chip, which most of its new systems have, and Apple is clearly looking to migrate away from other chips. The M1 has the GPU on the same chip, which is incredibly efficient from a system perspective. Apple does in fact have a ray tracing API that exploits the GPU hardware on the M1: https://developer.apple.com/videos/play/wwdc2020/10012/ Hopefully a future version of CA will use this API. That said, it's unlikely that the Apple M1 will ever come close to a dedicated GPU card like the 3080 (and those cost $2000 right now). Still, it would be nice if CA would actually use the hardware that is available to them for those of us who don't want to switch platforms.
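For anyone curious what that looks like from the developer side, here is a minimal sketch (plain Swift, unrelated to anything CA ships): it simply asks the default Metal device whether it reports hardware ray tracing support, the capability the WWDC session above builds on.

```swift
import Metal

// Minimal capability check only; how (or whether) Chief Architect would
// integrate Metal ray tracing is an open question, not shown here.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Hardware ray tracing supported: \(device.supportsRaytracing)")
} else {
    print("No Metal device available")
}
```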
  11. I am not having any issues, just trying to get an idea of when the feature I want will be on the Mac product.
  12. Nope, never tried that. I just use a Mac.
  13. I looked and looked and couldn't find that thread. Do you know where the info is on that? If it's a bug in Apple's software I'd like to know where so I can know when to expect a fix.