    GTX 560 Ti video card good for Revit?

    Hello everyone!

    I'm considering the GTX 560 Ti for Revit. Can someone give me some insight on its overall performance? Most importantly, are you able to move around a decent-sized model without the transitions being choppy?

    Also, this might be off topic, but has anyone tried using Google SketchUp with the GTX 560?

    Thank you very much, any kind of insight is greatly appreciated.



    P.S. Here's the card that I'm looking at:

    http://www.newegg.com/Product/Produc...-565-_-Product


    Thanks!

    #2
    We can't really tell you whether it's going to be a good card or not. The only way we could know is if someone had already posted their results. We have a forum section dedicated to video cards; you can find it here: Revit-Hardware-Video-Graphic-Cards

    However, we have seen some success stories about the 400 and 500 series Nvidia cards in the past.

    Please keep us up to date on your purchase and experience with the card you buy. :cheers:
    Last edited by Alex Cunningham; April 18, 2011, 02:24 PM.
    -Alex Cunningham



      #3
      Welcome to the forums chaoticfreedom!

      So far, it seems like a very good card. I don't personally have one currently in my possession, but a couple Revit friends do, and I briefly tried it out myself.

      Don't know about SU specifically, but I'd guess that it will work well.

      Whether a new video card eliminates the "choppiness" may depend just as much on what CPU you have (it might be helpful to list the rest of your computer specs when asking a hardware question like this). It's actually the CPU that generates the geometry you see on your screen; it takes very little effort for the video card to turn that into pixels. However, if you turn on fancy graphics options like shadows and Realistic Views, the video card will assist in processing those parts of the view.
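
      Just to illustrate that mental model, here's a toy back-of-envelope sketch. Every number in it is a made-up assumption for illustration; this isn't anything Revit actually exposes or measures.

      Code:
      # Toy model of the CPU/GPU split when orbiting a view.
      # All figures below are assumptions for illustration only.

      def frame_workload(triangles, pixels, effects_on):
          cpu_units = triangles        # CPU re-generates the geometry every frame
          gpu_units = pixels           # GPU rasterizes the result into pixels
          if effects_on:
              gpu_units *= 4           # shadows/realistic views add GPU passes
          return cpu_units, gpu_units

      # Assumed: a mid-size model (~2M triangles) on a 1600x1200 display.
      for effects in (False, True):
          cpu, gpu = frame_workload(2_000_000, 1600 * 1200, effects)
          print(f"effects {'on ' if effects else 'off'}: CPU {cpu:,} vs GPU {gpu:,}")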

      :beer:

      p.s. - I see you essentially posted the same question in another thread - in the future, please try to pick one spot to ask your question.



        #4
        I picked up this very same card last week. So far it seems very good. But as iru pointed out, whether or not transitions in 3D are smooth or choppy also depends on the CPU. I have a Core i7-870 at 2.93 GHz, and a small-to-medium (50MB) model spins around pretty smoothly with all the eye candy on.



          #5
          Originally posted by iru69 View Post
          Whether a new video card eliminates the "choppiness" may depend just as much on what CPU you have (it might be helpful to list the rest of your computer specs when asking a hardware question like this). ...



          ^^^^^ But with an upgraded processor: an AMD Phenom II X6 1090T Six-Core BE (got it cheap) and a 650W PSU that has two PCIe power connectors for a GPU.
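
          For anyone sizing a PSU for a similar combo, here's a rough headroom check. The wattage figures are ballpark assumptions; verify them against your own parts' spec sheets.

          Code:
          # Rough PSU headroom check; all draw figures are assumed ballparks.
          PSU_WATTS = 650

          draw_estimates_w = {
              "Phenom II X6 1090T (TDP, assumed)": 125,
              "GTX 560 Ti (TDP, assumed)": 170,
              "board/RAM/drives/fans (assumed)": 100,
          }

          total = sum(draw_estimates_w.values())
          print(f"estimated load: {total} W of {PSU_WATTS} W "
                f"({total / PSU_WATTS:.0%} of rating)")
          # Rule of thumb: keep sustained load comfortably under ~80% of rating.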



            #6
            So, you're currently relying on an integrated Radeon 4200? Yep, that's pretty weak. The GTX 560 should be a huge improvement with things like shadows on.

            So, you've upgraded the PSU to 650W... just curious, did you do that yourself, or was that an option when you purchased it?

            One other thing to check before purchasing the 560 is that it will fit in your case. Look up the card's length in its specs and make sure there's enough room (either by opening the case and measuring with a tape measure, or by checking the computer's manual, which sometimes lists the maximum length for a video card).

            Do you also now have more than 4GB of RAM? Just curious, because you can have some performance issues as well if you're low on RAM.
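
            If you want to rule RAM in or out as the bottleneck, a quick snapshot like this helps (assuming you have Python with the third-party psutil package handy; Task Manager shows the same numbers).

            Code:
            # Snapshot of RAM headroom; run it while your model is open.
            # Requires the third-party psutil package (pip install psutil).
            import psutil

            mem = psutil.virtual_memory()
            print(f"total:     {mem.total / 2**30:.1f} GiB")
            print(f"available: {mem.available / 2**30:.1f} GiB "
                  f"({100 - mem.percent:.0f}% free)")

            # Near-zero "available" means the OS is paging to disk, which can
            # feel a lot like a weak video card.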

            Best of luck!



              #7
              Originally posted by iru69 View Post
              So, you've upgraded the PSU to 650W... just curious, did you do that yourself, or was that an option when you purchased it? ... Do you also now have more than 4GB of RAM?
              I upgraded the power supply myself; the stock 250W unit was too weak.
              I also already checked the dimensions, and it'll fit with decent airflow. One thing I recommend to new buyers: make yourself a "paper and tape" GPU mock-up, mark where the fans and connectors go, then open up your case and see how it fits. That way there are no surprises when you install the real thing, or better yet, fewer surprises lol.
              As of now I only have the 4GB it came with. What difference, in general, do you think upgrading to the full 8GB would make?

              I was also thinking that maybe I'll pull the X6 1090T, bring back the Athlon II X4 640 (3.0 GHz), add 4GB more RAM (8GB total), keep the 650W PSU, and get the GTX 560 Ti, saving the X6 1090T for a more adequate build in the VERY near future. The thing is, I just need a simple setup that will keep up with me for now, since I have quite a few projects to finish and I could benefit from working at home. Once this workload phase is over, I'd transfer the GPU and power supply to the new build when I have the time and resources.
              So what do you think?
              Can Revit work well if I go back to the Athlon II X4 640 and fully upgrade the RAM and GPU? I could give it a go, preserve the X6 1090T, and see if the Athlon II X4 640 runs adequately; if not, just go on with the X6.

              Thanks for your time, the feedback is truly helpful.



                #8
                I also have a closely related question. I have an old Intel Q6600 quad that has been overclocked from 2.4GHz to 3.5GHz (it has run like this since 2007). Because of the overclocking I'm limited to 4GB of RAM. The video card is an Nvidia GeForce 8800 GTS (EVGA, 320MB of video RAM). My projects are usually 50MB at most (houses, 2-3 levels). Before, I had two small monitors (a 24" at 1600x1200 and a 19" at 1280x1024). When I orbited around things in a shaded view with Ambient Occlusion, it ran at about 5 frames per second. The problem is that some time ago I replaced the 24" monitor with a 30" monitor (2560x1600). Now when orbiting I get 1 frame per second (with Revit maximized on the 30" monitor).

                The question: if I replace the GeForce 8800 GTS (320MB) with a 560 Ti (2GB), will orbiting be better? Is the processor affected by the resolution?

                p.s. The motherboard is an Asus P5K. It supports the new 560 Ti... at least, a lot of users have upgraded from an old video card to the new 560 Ti on this motherboard.
                Last edited by gaby424; April 27, 2011, 07:32 AM.



                  #9
                  Is the processor affected by the resolution? In orbit mode? Should I start a new thread? Iru, where are you?



                    #10
                    Well, I'm hesitant because I can't say that I know for sure... I was going to do a little "testing" when I had some spare time just to see if I saw the same results.

                    The CPU has to process the geometry that is displayed when orbiting - every time the model "moves" on the screen, the CPU calculates the geometry and sends that along to the video card to be rasterized into pixels that show up on your screen.

                    With eye candy like shadows, ambient lighting, realistic views (textures), etc. turned on, resolution is very likely to affect performance (just like in a video game). At 2560x1600, the GPU is post-processing more than twice as many pixels as at 1600x1200.
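
                    The arithmetic behind that claim is easy to check; the figures come straight from the resolutions above.

                    Code:
                    # Pixel counts behind the resolution jump (plain arithmetic).
                    old_pixels = 1600 * 1200   # 1,920,000
                    new_pixels = 2560 * 1600   # 4,096,000
                    print(f"pixels per frame: {new_pixels / old_pixels:.2f}x")  # ~2.13x
                    # The 8800 GTS's 320MB also has to hold a proportionally larger
                    # framebuffer and post-processing buffers per frame.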

                    As for the 8800 GTS: though outdated, it's still not too shabby. But between the (only) 320MB of memory and the extra pixels of the bigger display, it's possible you'd see a noticeable drop in performance... what makes me hesitate is that I wouldn't expect a drop of the magnitude you're observing.

                    I don't want to recommend that you go out and spend $250 on a new video card on the chance that it helps the issue you're having. However, as long as you can take the new video card with you when you upgrade your CPU/MB, then I think it's worth a try (and whether or not you achieve the prior performance level, you should see at least some performance improvements just based on the fact that it's a faster card). Just make sure your current PSU can handle it and that whatever you get fits in your current case.

                    One caveat regarding Ambient lighting...

                    :beer:

