
  1. #1
    Join Date
    Sep 2010
    Posts
    1
    When I open the trial version of X3, I get a message that my video card is unable to give optimum performance. What does this mean? (I'm not very computer savvy, so talk baby talk, please.)

  2. #2
    Join Date
    Nov 2008
    Location
    New Zealand
    Posts
    706
    It basically means the video card needs upgrading to a more powerful one. Are you on an old machine? If so, it may be that the whole machine needs replacing. Check the website for the minimum spec required, then compare it to your machine.
    Graeme Taylor

    currently loaded X3.1 & X4.2-64 bit & X5 64 bit
    also used v7 to x12
    AMD Phenom 2 black 980 3.7 GHz quad core
    8GB DDR3-1333 RAM
    NVIDIA GEForce GTX560 1024Mb graphics
    win 7 -64 bit
    2 x 24 inch monitors

  3. #3
    Join Date
    Jan 2000
    Posts
    4,161
    Chief takes advantage of video card hardware to improve the speed of 3D rendering. The technology in video cards is always changing, and to get the most speed we take advantage of new technology as it becomes available.

    While you will be able to use Chief on that hardware, there are certain features that won't be available or will perform much worse than on more modern hardware.

    If you do decide to buy a machine to run Chief, look at the gaming machines. While it is possible to find really good deals on computers that will run Chief well, the lower-cost machines often have video, CPU, or other hardware that will limit the speed of the program quite severely.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.

  4. #4
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Thanks, Doug, but the issue I have been concerned with is that some, if not many or most, gaming notebooks have an auto turbo-boost feature that apparently does not kick in during normal operations in Chief. As a result, based on a comment by AL, you wind up running at 1.66 GHz (the low end) most of the time and not 2.8 GHz (the high end).

    I'm thinking I need to look for notebooks that allow the user to set the CPU speed when plugged in, and avoid the auto turbo-boost models. I'm also looking for one with a cooling system that will handle the load. Still looking.
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  5. #5
    Join Date
    Dec 2000
    Location
    Bonaire, Dutch Caribbean
    Posts
    252
    Larry, the following are my observations of CPU speed, using the CPUID application, on both my laptop and desktop PC. See my signature for the two CPUs I'm referencing.

    First I will discuss the Sager laptop. The BIOS does NOT allow overclocking, nor, if my memory is correct, does it allow control of the Turbo Boost (it is always on). That said, I do believe the algorithm (both BIOS and CPU microcode) that controls the CPU frequency is the one that most people would want and expect. When essentially nothing is being done (like typing this comment), the CPU frequency is at its lowest setting (to conserve power and reduce heat). With X3 running, at the first hint of CPU demand, the CPU frequency goes to the maximum allowed (roughly 2.8 GHz for my laptop). By watching Task Manager, I can see that X3 will typically have one or two cores going when there is demand, like zooming in plan view, etc.

    From my observations, most of the time when there is CPU demand in X3, the laptop will be running at max CPU frequency (what triggers this, I do not know, but it happens). For the laptop, the only scenario that forces the CPU clock frequency down to 1.7 GHz or so is a raytrace. In that case, all the cores are working, and for mobile CPUs the algorithm is to run at the non-turbo frequency to keep the CPU within its specified power dissipation. I doubt you will find any MOBILE CPU from Intel that operates differently. To restate the above, I believe it is incorrect (based on my observations) to say that for most X3 operations the laptop will operate at the low CPU frequencies; from my experience, just the opposite is true.
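    (If you want to log this behavior yourself rather than eyeball CPUID and Task Manager, here is a minimal sketch. It assumes Python 3 with the third-party psutil package installed, which is my own addition and not something used anywhere in this thread.)

    # Minimal sketch: sample overall CPU frequency and per-core load once a second,
    # e.g. while X3 is zooming a plan view or running a raytrace.
    # Assumes Python 3 with the third-party 'psutil' package installed.
    import time
    import psutil

    def log_cpu_frequency(samples=30, interval=1.0):
        """Print current/min/max CPU frequency plus per-core utilisation."""
        for _ in range(samples):
            freq = psutil.cpu_freq()                 # MHz: current, min, max
            loads = psutil.cpu_percent(percpu=True)  # per-core load since the previous call
            if freq is None:                         # some systems don't expose frequency info
                print(f"freq n/a  loads: {loads}")
            else:
                print(f"{freq.current:7.0f} MHz (min {freq.min:.0f}, max {freq.max:.0f})  loads: {loads}")
            time.sleep(interval)

    if __name__ == "__main__":
        log_cpu_frequency()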

    Now, on the desktop, the CPU frequency algorithm works a little differently because the CPU has a much higher TDP (thermal design power) specification. The i7-820QM CPU in my laptop has a TDP of 45 watts. The i7-920 has a TDP of 130 watts! So, on the desktop, the idle frequency algorithm is still in operation, and, if X3 demands some performance from a couple of cores, the CPU frequency goes to max, just like in the laptop. The difference is that when doing a raytrace, the CPU frequency STAYS at MAX for all the cores. This, of course, creates a lot of heat and uses a lot of power, something most laptops are not designed to handle.

    If you want this type of performance in a laptop, Sager has a desktop replacement laptop that uses the i7-920 (maybe the i7-930, or even i7-9xx Extreme CPUs by now). I have no idea whether its BIOS allows overclocking, but I expect that the CPU frequency will NOT drop back when all the cores are working, because this is a desktop CPU and most of the power-controlling code is part of the microcode in the CPU. This is something you should confirm with Sager tech support. Also, given how loud the fans are on my Sager, I expect this desktop replacement laptop will require noise-cancelling headphones.
    Barton

    ====
    Chief Architect X5 Premier Latest, Google SketchUp 8
    PC: OS:Win 8 Pro x64, Intel Core i7 3770K 3.5 GHz on an Asus Sabertooth motherboard, 32 GB RAM, NVIDIA GeForce GTX 460 Graphics card, SSD for boot disk.
    Laptop: OS: Win 8 Pro x64, HP dv7tQuadEdition, Core(TM) i7-2670QM - 2.2 GHz, 8 GB RAM, 2GB AMD Radeon(TM) HD 7690M GDDR5, 660GB Dual Drive (160GB SSD/500GB 7200 rpm)

  6. #6
    Join Date
    Dec 2000
    Location
    Bonaire, Dutch Caribbean
    Posts
    252
    I guess I had a senior moment: most of what I said above, I already said in some form on this thread back at the beginning of August.

    There is some confusion, I think, about this "turbo mode" thing because Intel has another CPU clock frequency changing approach that they call "speed stepping". For most of what I discussed above, the CPU clock frequency was changed by the speed-stepping algorithm. The CPU speed is derived from the "bus speed" and the "bus multiplier". When the CPU is at idle, the bus multiplier is typically 12, but on demand for more compute power the multiplier changes to 19. Turbo mode changes the multiplier from 19 to 21 (not a huge incremental change in my book). This is why the i7-920 is spec'd as a 2.66 GHz CPU (19 x 140 MHz bus speed) with a turbo-mode speed of 2.93 GHz (21 x 140 MHz).

    What is "cute" about the Intel specs is that the i7-820QM is a 1.73 GHz CPU with a turbo frequency of 3.06 GHz. Guess what: a bus speed of 144 MHz and a multiplier of 12 gets you the 1.73 GHz, and a multiplier of 21 gets you roughly the 3.06 GHz. For the mobile CPUs, Intel chose to publish the "idle" clock speed, while for the desktops they chose the "non-turbo" maximum clock speed. Desktop CPUs idle at the same 12x bus speed that the laptop CPUs idle at, but the desktop CPUs "seem" to be way faster. How is that for specsmanship? To be fair, it is the TDP (thermal design power) that ultimately forces this specsmanship, as the i7-820QM laptop CPU, with all cores operating, can only run at 1.7 GHz to stay within the 45 W TDP, while the i7-920 desktop CPU can run at the 19 multiplier because it has a 130 W TDP spec.
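    (For anyone who wants to check the arithmetic, here is a small sketch that just multiplies the bus speeds and multipliers quoted above. The figures are the ones from this post, not Intel's official datasheet numbers.)

    # Small sketch: CPU clock = bus (base) clock x multiplier, using the figures quoted above.
    def clock_ghz(bus_mhz, multiplier):
        return bus_mhz * multiplier / 1000.0

    # i7-920 desktop, 140 MHz bus as quoted above
    print(f"i7-920 idle   : {clock_ghz(140, 12):.2f} GHz")  # ~1.68 GHz
    print(f"i7-920 loaded : {clock_ghz(140, 19):.2f} GHz")  # 2.66 GHz, the spec number
    print(f"i7-920 turbo  : {clock_ghz(140, 21):.2f} GHz")  # ~2.94 GHz, spec'd as 2.93

    # i7-820QM mobile, 144 MHz bus as quoted above
    print(f"i7-820QM idle : {clock_ghz(144, 12):.2f} GHz")  # ~1.73 GHz, the spec number
    print(f"i7-820QM turbo: {clock_ghz(144, 21):.2f} GHz")  # ~3.02 GHz, spec'd as 3.06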
    Last edited by Barton Brown; 09-11-2010 at 02:23 PM.
    Barton

    ====
    Chief Architect X5 Premier Latest, Google SketchUp 8
    PC: OS:Win 8 Pro x64, Intel Core i7 3770K 3.5 GHz on an Asus Sabertooth motherboard, 32 GB RAM, NVIDIA GeForce GTX 460 Graphics card, SSD for boot disk.
    Laptop: OS: Win 8 Pro x64, HP dv7tQuadEdition, Core(TM) i7-2670QM - 2.2 GHz, 8 GB RAM, 2GB AMD Radeon(TM) HD 7690M GDDR5, 660GB Dual Drive (160GB SSD/500GB 7200 rpm)

  7. #7
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Thanks for the info, Barton. I took a quick look at Sager after your previous posts, and they are certainly in the running.

    While I have had very good service from my old Asus G1 notebook, I don't like what I have been reading about how their new 15.6", i7 G-series notebooks handle the heat (perhaps I should say how they don't handle the heat very well). They seem to have done a better job with their 17" notebooks, but I don't really want one that big. When I use the notebook at home I plug in a 26" monitor and a full-size keyboard anyway, so I prefer the smaller notebooks with higher resolution.

    Quote Originally Posted by Barton Brown View Post
    Also, given how loud the fans are on my Sager, I expect this desktop replacement laptop will require noise-cancelling headphones.
    I was looking at the option of getting a desktop replacement notebook, since both my workstation and notebook are due for replacement, but....
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  8. #8
    Join Date
    Jul 2010
    Posts
    4

    New Asus laptop

    I have also been researching laptops and found out that Asus is taking preorders for a new 15" with the same design as the 17". Google Asus G53JW.

  9. #9
    Join Date
    Aug 1999
    Location
    Ridgway, Colorado, USA
    Posts
    2,917
    Quote Originally Posted by vintage58 View Post
    I have also been researching laptops and found out that Asus is taking preorders for a new 15" with the same design as the 17". Google Asus G53JW.
    Thanks, that is good to know. The fact that I can still run X3 on both my workstation and notebook means I don't really need to get one right away, and Asus does normally seem to provide good "bang for the buck". There may be another reason to wait.

    While I will be the first to admit that I am not a hardware expert, I did a little reading about 32 nm vs. 45 nm CPUs. This may be an oversimplification, but the 32 nm parts, in general, will run faster at lower power levels and with less heat. From what I can see right now, all of the quad-core mobile CPUs are 45 nm. I don't know how long it will be before Intel comes out with a 32 nm mobile quad core, but it might be worth waiting if you don't really "need" a replacement right now. Anyone know anything more about that?

    As a side note, there are i7 CPUs that are dual core. Up until yesterday I thought all the i7s were quads. Just one more thing to be aware of. Is it just me, or is it getting harder to figure out what computer to buy these days?
    Larry

    Lawrence C. Kumpost, Architect

    No matter how much you push the envelope, it'll still be
    stationery.

  10. #10
    Join Date
    Jul 2010
    Posts
    4

    Intel new CPU in the spring


  11. #11
    Join Date
    Jan 2000
    Posts
    4,161
    We don't do anything that prevents its use, but performance may not be as fast as you expect. SLI and CrossFire allow multiple video cards to share the rendering load. We have not specifically optimized for these configurations, so I doubt you will get as much benefit from using them as is theoretically possible.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.

  12. #12
    Join Date
    Sep 1999
    Location
    Auckland New Zealand
    Posts
    1,310
    I have recently bought an Asus K52J laptop.

    Model names and numbers differ all over, so it's hard to compare, but it has an i5 processor (2.53 GHz - 2.26 GHz, with Turbo Boost up to 3.06/2.93/2.66/2.53 GHz), 4 GB RAM, and an ATI HD 5470 with 1 GB RAM.

    The 15.6" LED display is extremely crisp, but I would rather not have the high-gloss screen which all laptops seem to have these days. It runs a lot quieter than my previous Toshiba and doesn't seem to get too hot. Chief runs fine, although I am not using it as my main machine.
    Gordon Martinsen
    Auckland
    New Zealand
    W7 64 bit X5
    i7 2600k 3.7Ghz
    8 GB RAM
    180Gb SSD
    Nvidia GTX 560 1 Gb

  13. #13
    Join Date
    Mar 2010
    Posts
    51
    Can anybody tell me if there's a considerable performance difference in X3 between a video card with 512 MB of RAM and a 1 GB card?
    On a side note, it would really help to have a general hardware-guidance white paper, or a forum that clearly lists systems/components that work and components people are having trouble with.

  14. #14
    Join Date
    Jan 2000
    Posts
    4,161
    Whether there is a significant difference in performance depends on how big your model is. If you don't use the whole 512 MB of RAM, then no. If you go over it, then yes.
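    (To make "how big your model is" a little more concrete, here is a rough back-of-envelope sketch. The byte counts are illustrative assumptions of mine, not numbers that Chief reports or guarantees.)

    # Rough sketch: estimate whether a model's graphics data fits in a given amount of video RAM.
    # All constants are illustrative assumptions (RGBA8 textures, ~32 bytes per vertex, triple-buffered screen).
    def estimate_vram_mb(num_textures, avg_texture_pixels, num_vertices,
                         screen_w=1920, screen_h=1200):
        framebuffer = screen_w * screen_h * 4 * 3         # colour + depth + back buffer, roughly
        textures = num_textures * avg_texture_pixels * 4  # 4 bytes per texel (RGBA8)
        geometry = num_vertices * 32                      # position + normal + UV, assumed layout
        return (framebuffer + textures + geometry) / (1024 * 1024)

    # A plan with 200 materials at 1024x1024 and 2 million vertices:
    print(f"~{estimate_vram_mb(200, 1024 * 1024, 2_000_000):.0f} MB")  # ~890 MB: over a 512 MB card, within 1 GB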

    Listing components that work is hard in that almost everything works pretty well. Listing things that don't work is also a problem in that drivers sometimes change to either fix or break video cards with respect to Chief. For example, Intel was having a lot of trouble with some of their integrated graphics chips for quite a while. Earlier this year they released drivers that fixed the issues that had been observed.

    The other kicker is that almost every graphics card has a quirk here or there that can cause issues.

    In addition, reports of issues are sometimes due to hardware failure, improper driver installation, or old drivers, so it is sometimes hard to tell whether the reports are accurate.

    We simply don't have the resources to test all combinations of video cards and hardware internally, and I wouldn't trust a customer run reporting mechanism because of variability in how the testing would be done.

    One other issue is the rapidly changing technology. For example, when the i7s first came out there were no slow versions of them; now there are. So at first, saying "get an i7" was good advice. Now you have to look at the specific model to see what features it has in order to make a good judgment about whether it's a fast one or a slower one.

    As far as guidance goes: read this thread, and read hardware reviews from other sources. For example, if someone reports that a video card is slow, I usually go looking for a review, and almost without exception the reviews show that the card is slow relative to other cards.
    Doug Park
    Principal Software Architect
    Chief Architect, Inc.

  15. #15
    Join Date
    Jan 2009
    Posts
    36
    I am looking at a new laptop:

    Toshiba Satellite P500
    18.4" diagonal widescreen TruBrite® TFT LCD display at 1680 x 945 native resolution (HD+)
    Windows® 7 Professional 64-bit
    i7-740QM processor, 1.73 GHz (2.93 GHz with Turbo Boost Technology), 6 MB L3 Cache
    NVIDIA® GeForce® GT 330M with 1 GB GDDR3 discrete graphics
    8 GB (4 GB x 2) DDR3 1066 MHz
    500 GB (7200 RPM) Serial ATA hard disk drive

    Is this setup looking good for X2 or greater in the future? Is the graphics card OK?

    It seems from some posts that I may not see the speed upgrade over dual cores that I might hope for?

    I have found mostly positive feedback on Toshiba, but does anyone else have experience with them? The Sagers and Asus seem extremely expensive, and there are too many options on the Sager for me to customize.

    BTW, the price quoted this week for this setup, with full glass display (no bezel), backlit keyboard, Office Professional, 12-cell battery, and Dynadock docking port, is $2,003.00.

    It seemed like a good deal compared to Dell, HP, Sager, IBM, Sony, etc.

 

 
