  1. #211
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Quote Originally Posted by vintage58:
    I am working on the specs for a new computer for X3. There is very little talk of AMD 1090T 6 core. All that I read on other sites say it is a great gaming cpu and will be better in the future as software takes advantage of the 6 cores. The price is very low, so has anybody considered this?
    I must start by saying I am in the same boat, trying to figure out the best bang for the buck in getting a new system. I will say that I have had very good service from my current (old) AMD dual core, which runs X3 OK except for a graphics driver glitch when I zoom in or out in 3D views.

    What I am hearing in this thread is that, while Doug Park said the more cores the better, the way the system is set up to utilize those cores matters in addition to how the program uses them. This may, however, be more of a factor in notebooks, where the CPU runs at a slower speed to keep power requirements and excess heat down until demand is higher. The question is, what triggers the use of additional cores and higher speed? Based on what I have heard here, you want a system where the CPU runs over 2 GHz at the low end.

    Short version: be careful of systems with a CPU that runs at a range from 1.66 to 2.8 GHz, or you may be stuck running at the lower speed for most things in Chief.

    Perhaps others have more/better information on this issue.
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  2. #212
    Join Date
    Jan 2010
    Posts
    8
    So, does X3 support 6-core processors or not? I mean, does it actually use all 6 of them, or just 4? From my limited knowledge of PCs, software MUST be designed to utilize a certain number of cores. So X3 either knows how to work with 6-core CPUs or it doesn't, in which case buying a Phenom X6 for X3 would be a waste of money. Any official word on it?

    It would be very nice if someone benchmarked various PCs doing the same scene/rendering. It would be interesting to know what to spend money on - a good CPU or a good video card.
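Chief itself has no scripting interface for automated benchmarks, but a rough apples-to-apples CPU comparison between machines is easy to script. The sketch below (Python, purely illustrative; the workload is arbitrary) times a fixed CPU-bound loop so different PCs can be compared with a single number. It says nothing about the video card, which would need a separate 3D test.

```python
import time

def cpu_benchmark(n=2_000_000):
    """Time a fixed floating-point workload; lower elapsed time is faster."""
    start = time.perf_counter()
    total = 0.0
    for i in range(1, n):
        total += i ** 0.5  # arbitrary CPU-bound arithmetic
    elapsed = time.perf_counter() - start
    return elapsed, total

elapsed, _ = cpu_benchmark()
print(f"Workload took {elapsed:.2f} s on this machine")
```

Run the same script on each candidate machine and compare the times; it exercises only a single core, so it measures per-core speed rather than core count.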

  3. #213
    Join Date
    Apr 2004
    Location
    LOCKPORT NY
    Posts
    18,655
    I remember seeing posts by someone from CA stating "the more cores the better."

    While Chief may only use X cores, you would usually have other software running, plus the OS itself, etc.

    Might be a good idea to call tech support on Tuesday to discuss.

    Lew
    Lew Buttery
    Castle Golden Design - "We make dreams visible"

    Lockport, NY
    716-434-5051
    www.castlegoldendesign.com
    lbuttery at castlegoldendesign.com

    CHIEF X5 (started with v9.5)

  4. #214
    Join Date
    Nov 2008
    Location
    New Zealand
    Posts
    706
    For my two cents' worth, I would go with 4 cores and get a faster speed for similar money.
    Graeme Taylor

    currently loaded X3.1 & X4.2-64 bit & X5 64 bit
    also used v7 to x12
    AMD Phenom 2 black 980 3.7 GHz quad core
    8GB DDR3-1333 RAM
    NVIDIA GeForce GTX560 1024 MB graphics
    win 7 -64 bit
    2 x 24 inch monitors

  5. #215
    Join Date
    Jan 2000
    Posts
    4,161
    Chief supports 6 and more cores. Some algorithms, such as ray tracing, scale well to as many cores as you can buy. Others don't scale as well, so as more cores are added, some things will speed up more than others.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.
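The uneven scaling described above is what Amdahl's law predicts: if only a fraction p of a job can run in parallel, n cores give at most 1/((1-p) + p/n) times the speed. A quick sketch with illustrative fractions (not measurements of Chief):

```python
def amdahl_speedup(p, n):
    """Maximum speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# A ray-trace-like job that is almost entirely parallel scales well:
print(round(amdahl_speedup(0.95, 6), 2))  # 4.8x on 6 cores
# A task that is only half parallel barely benefits from extra cores:
print(round(amdahl_speedup(0.50, 6), 2))  # 1.71x on 6 cores
```

This is why a 6-core CPU can nearly triple ray-trace throughput while leaving everyday editing operations largely unchanged.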

  6. #216
    Join Date
    Oct 2009
    Posts
    9
    Tale of Two HP Laptops...

    X3 on Laptop #1 -- Screamin'. Awesome. Makes me dizzy.
    X3 on Laptop #2 -- 1 min 45 sec to launch X3 and reach the Options screen. Freezes for 30 seconds before it even thinks it can look at a 3D perspective.

    CA support said: "ATI Mobility Radeon is a chipset and won't work well with X3." Great...

    I'm confused, as I see no such warnings on the forums. I'm about to dump Laptop #2. Any counter-arguments saying Laptop #2 should work just fine?
    Thanks, Mark

    LAPTOP #1 -- HP dv7t. Intel i7 quad; 6 GB RAM; NVIDIA GeForce w/ 1 GB video RAM.

    LAPTOP #2 -- HP dv6z. AMD Phenom quad-core @ 2 GHz; 6 GB RAM; ATI Mobility Radeon HD5650 w/ 1 GB video RAM.

  7. #217
    Join Date
    Jan 2000
    Posts
    4,161
    Launch time depends on a number of factors:

    1) The program and its DLLs need to be loaded. The first time they are loaded, the DLLs are read from disk and cached in memory by the OS. The amount of memory and the speed of the disk can make a big difference for this part.

    2) Libraries are loaded. Again, on the first launch the OS will cache portions of the library file in memory so later loads are faster. Memory and disk speed are important factors, as is the size of your library: if one system's library has far fewer items, it will load faster.

    3) We try to ping our server to validate your right to use Chief. This is normally very fast, less than a second, but under the right circumstances it can take quite a few seconds for the network connection to time out. A good connection, or a clean disconnection from the network, makes this fast.

    4) CPU speed. There is a fair amount of processing, but this is normally a small percentage of the actual launch time, which is mostly limited by disk speed.
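The OS caching effect in (1) and (2) is easy to see: a second read of the same file is served from the OS cache and is usually much faster than the first. A minimal sketch (Python; the file here is a throwaway stand-in for a DLL or library catalog, not a real Chief file):

```python
import os
import tempfile
import time

# A throwaway file standing in for a DLL or library catalog.
path = os.path.join(tempfile.gettempdir(), "fake_library.dat")
with open(path, "wb") as f:
    f.write(os.urandom(4 * 1024 * 1024))  # 4 MB of dummy data

def timed_read(p):
    """Return the wall-clock time to read the whole file."""
    start = time.perf_counter()
    with open(p, "rb") as f:
        f.read()
    return time.perf_counter() - start

cold = timed_read(path)  # first read (may already be cached, since we just wrote it)
warm = timed_read(path)  # almost certainly served from the OS cache
print(f"first read: {cold*1000:.1f} ms, second read: {warm*1000:.1f} ms")
os.remove(path)
```

On a machine that has just booted, the difference between the truly cold first launch and subsequent launches is far more dramatic than this small demo shows.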


    I don't have any experience with the recent AMD processors. Just because something has i7 on the cover is no guarantee of good performance, either. There are several flavors of this processor, some of which are targeted at low-cost systems.

    The lower power mobile cards usually exhibit much worse performance regardless of vendor so you have to be careful of what you buy.

    Unfortunately, the specs on computers can be very hard to decipher so some research needs to be done before your purchase. When buying a laptop I would tend to look at laptops that are designed as gaming computers. These will generally have the better video cards and faster CPUs and will usually be somewhat more expensive.

    When you get into the lower cost systems a lot more care about what you buy is required so you don't end up with a lemon.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.

  8. #218
    Join Date
    Sep 2010
    Posts
    1
    When I open the trial version of X3, I get a message that my video card is unable to give optimum performance. What does this mean? (I'm not very computer savvy - talk baby talk, please.)

  9. #219
    Join Date
    Nov 2008
    Location
    New Zealand
    Posts
    706
    It basically means the video card needs upgrading to a more powerful one. Are you on an old machine? If so, the whole machine may need replacing. Check the website for the minimum required specs, then compare them to your machine.
    Graeme Taylor

    currently loaded X3.1 & X4.2-64 bit & X5 64 bit
    also used v7 to x12
    AMD Phenom 2 black 980 3.7 GHz quad core
    8GB DDR3-1333 RAM
    NVIDIA GeForce GTX560 1024 MB graphics
    win 7 -64 bit
    2 x 24 inch monitors

  10. #220
    Join Date
    Jan 2000
    Posts
    4,161
    Chief takes advantage of video card hardware to improve the speed of 3D rendering. The technology in video cards is always changing and to get the most speed we take advantage of technology as it becomes available.

    While you will be able to use Chief on that hardware there are certain features that won't be available or will perform much worse than on more modern hardware.

    If you do decide to buy a machine to run Chief look at the gaming machines. While it is possible to find really good deals on computers that will run Chief well, the lower cost machines often have video, CPU or other hardware that will limit the speed of the program quite severely.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.

  11. #221
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Thanks, Doug, but the issue I have been concerned with is that some, if not many or most, gaming notebooks have an auto turbo-boost feature that apparently does not kick in during normal operations in Chief. As a result, based on a comment by AL, you wind up running at 1.66 GHz (low end) most of the time and not 2.8 GHz (high end).

    I'm thinking that I need to look for notebooks that allow the user to set the CPU speed when plugged in, and avoid the auto turbo-boost models. Also for one with a cooling system that will handle the load. Still looking.
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  12. #222
    Join Date
    Dec 2000
    Location
    Bonaire, Dutch Caribbean
    Posts
    252
    Larry, the following are my observations of CPU speed, using the CPUID application, on both my laptop and my desktop PC. See my signature for the two CPUs I'm referencing.

    First, the Sager laptop. The BIOS does NOT allow over-clocking, nor, if my memory is correct, does it allow control of Turbo Boost (it is always on). That said, I believe the algorithm (both BIOS and CPU microcode) that controls the CPU frequency is the one most people would want and expect. When essentially nothing is being done (like typing this comment), the CPU frequency is at its lowest setting (to conserve power and reduce heat). With X3 running, at the first hint of CPU demand, the CPU frequency goes to the maximum allowed (roughly 2.8 GHz for my laptop).

    By monitoring the Task Manager, I can see that X3 will typically have 1 or 2 cores going when there is demand, like zooming in plan view, etc. From my observations, most of the time there is CPU demand in X3, your laptop will be running at max CPU frequency (what triggers this, I do not know, but it happens). For the laptop, the only scenario that forces the CPU clock down to 1.7 GHz or so is a raytrace. In that case all the cores are working, and for mobile CPUs the algorithm is to run at the non-turbo frequency to keep the CPU within its specified power dissipation. I doubt you will find any MOBILE CPU from Intel that operates differently. To restate: I believe it is incorrect (based on my observations) to say that the laptop runs at the low CPU frequencies for most X3 operations; from my experience, just the opposite is true.

    Now, on the desktop, the CPU frequency algorithm works a little differently because the CPU has a much higher TDP (thermal design power) specification. The i7-820QM CPU in my laptop has a TDP of 45 watts. The i7-920 has a TDP of 130 watts! So, on the desktop, the idle-frequency algorithm is still in operation, and if X3 demands performance from a couple of cores, the CPU frequency goes to max, just like on the laptop. The difference is that when doing a raytrace, the CPU frequency STAYS at MAX for all the cores. This, of course, creates a lot of heat and uses a lot of power, something most laptops are not designed to handle.

    If you want this type of performance in a laptop, Sager has a desktop-replacement laptop that uses the i7-920 (maybe the i7-930, or even the i7-9xx Extreme CPUs by now). I have no idea whether the BIOS allows over-clocking, but I expect the CPU frequency will NOT drop back when all the cores are working, because this is a desktop CPU and most of the power-controlling code is part of the microcode in the CPU. This is something you should confirm with Sager tech support. Also, given how loud the fans are on my Sager, I expect this desktop-replacement laptop will require noise-cancelling headphones.
    Barton

    ====
    Chief Architect X5 Premier Latest, Google SketchUp 8
    PC: OS:Win 8 Pro x64, Intel Core i7 3770K 3.5 GHz on an Asus Sabertooth motherboard, 32 GB RAM, NVIDIA GeForce GTX 460 Graphics card, SSD for boot disk.
    Laptop: OS: Win 8 Pro x64, HP dv7tQuadEdition, Core(TM) i7-2670QM - 2.2 GHz, 8 GB RAM, 2GB AMD Radeon(TM) HD 7690M GDDR5, 660GB Dual Drive (160GB SSD/500GB 7200 rpm)

  13. #223
    Join Date
    Dec 2000
    Location
    Bonaire, Dutch Caribbean
    Posts
    252
    I guess I had a senior moment - most of what I said above, I said in some form on this thread back at the beginning of August.

    There is some confusion, I think, about this "turbo mode" thing, because Intel has another clock-frequency-changing approach they call "speed stepping". For most of what I discussed above, the CPU clock frequency was changed by the speed-stepping algorithm. The CPU speed is derived from the "bus speed" and the "bus multiplier". When the CPU is at idle, the bus multiplier is typically 12, but on demand for more compute power it changes to 19. Turbo mode changes the multiplier from 19 to 21 (not a huge incremental change, in my book). This is why the i7-920 is spec'd as a 2.66 GHz CPU (19 x 140 MHz bus speed) with a turbo-mode speed of 2.93 GHz (21 x 140 MHz).

    What is "cute" about the Intel specs is that the i7-820QM is a 1.73 GHz CPU with a turbo frequency of 3.06 GHz. Guess what: a bus speed of 144 MHz and a multiplier of 12 gets you the 1.73 GHz, and a multiplier of 21 gets the 3.06 GHz. For the mobile CPUs, Intel chose to publish the "idle" speed, while for the desktop they chose the "non-turbo" max speed. Desktop CPUs idle at the same 12x bus speed that the laptop CPUs idle at, but the desktop CPUs "seem" way faster. How is that for specsmanship? To be fair, it is the TDP (thermal design power) that ultimately forces this specsmanship: the i7-820QM laptop CPU, with all cores operating, can only run at 1.7 GHz to stay within its 45 W TDP, while the i7-920 desktop CPU can run at the 19 multiplier because it has a 130 W TDP spec.
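The arithmetic above is simply clock = bus speed x multiplier. Plugging in the figures quoted here reproduces the published numbers (with small rounding: Intel's 3.06 GHz turbo figure implies a bus clock a touch above 144 MHz):

```python
def core_clock_mhz(bus_mhz, multiplier):
    """CPU core clock in MHz from bus speed and bus multiplier."""
    return bus_mhz * multiplier

# i7-920 desktop: spec'd at the non-turbo max multiplier.
print(core_clock_mhz(140, 19))  # 2660 MHz, the 2.66 GHz spec
print(core_clock_mhz(140, 21))  # 2940 MHz, roughly the 2.93 GHz turbo spec
# i7-820QM mobile: spec'd at the idle multiplier.
print(core_clock_mhz(144, 12))  # 1728 MHz, the 1.73 GHz spec
print(core_clock_mhz(144, 21))  # 3024 MHz, close to the 3.06 GHz turbo spec
```

Comparing the two spec sheets without knowing which multiplier each number was taken at is what makes the desktop chip look disproportionately faster.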
    Last edited by Barton Brown; 09-11-2010 at 02:23 PM.
    Barton

    ====
    Chief Architect X5 Premier Latest, Google SketchUp 8
    PC: OS:Win 8 Pro x64, Intel Core i7 3770K 3.5 GHz on an Asus Sabertooth motherboard, 32 GB RAM, NVIDIA GeForce GTX 460 Graphics card, SSD for boot disk.
    Laptop: OS: Win 8 Pro x64, HP dv7tQuadEdition, Core(TM) i7-2670QM - 2.2 GHz, 8 GB RAM, 2GB AMD Radeon(TM) HD 7690M GDDR5, 660GB Dual Drive (160GB SSD/500GB 7200 rpm)

  14. #224
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Thanks for the info, Barton. I took a quick look at Sager after your previous posts, and they are certainly in the running.

    While I have had very good service from my old Asus G1 notebook, I don't like what I have been reading about how their new 15.6", i7 G-series notebooks handle the heat (perhaps I should say how they don't handle it very well). They seem to have done a better job with their 17" notebooks, but I don't really want one that big. When I use the notebook at home I plug in a 26" monitor and a full-size keyboard anyway, so I prefer the smaller notebooks with higher resolution.

    Quote Originally Posted by Barton Brown: Also, given how loud the fans are on my Sager, I expect this desktop replacement laptop will require noise-cancelling headphones.
    I was looking at the option of getting a desktop replacement notebook since both my work station and notebook are due for replacement but....
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  15. #225
    Join Date
    Jul 2010
    Posts
    4

    New Asus laptop

    I have also been researching laptops and found out that Asus is taking preorders for a new 15" with the same design as the 17". Google Asus G53JW.
