
Thread: RFO Benchmark v2

  1. #1 Gordon Price (Administrator)

    RFO Benchmark v2

    We are pleased to (finally) release RFO Benchmark v2!

    Please post your results in
    RFO Benchmark v2 2016 results
    RFO Benchmark v2 2015 results
    RFO Benchmark v2 2014 results
    RFO Benchmark v2 Virtualization results

    This new test is a little easier to work with: simply pick the "Set" you want to run, as defined by the shortcuts in the root Benchmark folder, and double-click. Your results file will end up in that folder as well.

    There are three "sets" by default.
    • Standard: (best for posting on RFO to make comparisons) This set includes the same benchmark modules as the old test, but it runs three times and averages the results.
      Note: The journals here are similar to the old test, but not identical, so results should NOT be compared with results from the old test.
    • Expanded: (best for testing to make purchase decisions) This set adds new benchmarks, expands on existing benchmarks and builds a "heavier" model to more realistically test hardware. It also runs 5 times, throws out high and low outliers for each benchmark and averages the rest.
      Note: This test can take a LONG time to complete, even for you speed demons getting double digit results on the old test.
    • Simplified: (best for a quick benchmark) This set does away with the Render benchmark and the non-hardware-accelerated Graphics benchmark, and only runs once. It also provides an example of the "Messages" functionality, which displays which benchmark module is being run. This can be helpful when also monitoring CPU, GPU or RAM utilization, to see where your bottlenecks are.


    A few things to be aware of:
    • All options are defined as arguments in the shortcut (an example shortcut Target line follows this list). Options include:
      -testSet:??? controls which set, as defined in the XML file, is run.
      -runCount:# sets the number of times the set should be repeated.
      -csv produces a Comma Separated Values file of the raw data, in addition to the regular "Results" file formatted for reading.
      -messages produces a Windows popup message as each benchmark module starts.
    • When runCount is between 2 and 4, results are averaged. When runCount is 5 or higher, the high and low outliers are dropped and the remaining data averaged (a small worked sketch of this follows the list). All raw data is tabulated at the bottom of the Results file, as well as in the CSV file. More than 5 runs is probably unnecessary.
    • Sets and Benchmarks are defined in the XML, along with behaviors like notes, grouping and summing, Revit INI modifications, etc. This makes the tool much more modular, so you can use it as a jig for creating your own benchmarks.
    • Thanks to some great input from the Factory, the Journals have been modified to eliminate some potential misbehavior (the PermissiveJournal debug mode), address some benchmarks that were either not reporting the full time to complete or were reporting extraneous time, and to make maintenance easier moving forward. This does, however, mean the results cannot be compared with results from the earlier test.
    • A couple of interesting techniques are used in the Journal files, for you Journal nerds, including...
      VBScript Conditionals: Journals allow for VBScript, which can include conditional journal blocks. The Model_creation journal has an example of this related to Revit Structure (a minimal sketch also follows this list).
      Disarticulated Journals: A single recorded journal can be broken apart, with Save and Open sections added along with the data needed to reestablish the processing environment, resulting in a much more modular journal structure. An example of this can be seen in the New_file and Model_creation journals, which used to be a single Journal file.
      Optional Journals: Using Disarticulated journals, an optional journal can be inserted into a benchmark sequence. An example of this can be seen in the Array_link journal used in the Expanded test.
      I hope these techniques prove useful for folks looking to expand their use of Journal files. And again, big thanks to the Factory for cluing me in on these possibilities.
    • If you have any problems, please report them in this thread so the test can be revised and a new build issued.
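
    For reference, a shortcut Target line combining the options above might look something like the line below. The path is a placeholder; point it at whatever launcher sits in your download's root Benchmark folder.

      <path to benchmark launcher> -testSet:Standard -runCount:3 -csv -messages

    That would run the Standard set three times, write the raw-data CSV alongside the regular Results file, and pop up a message as each benchmark module starts.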

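    To make the outlier handling concrete, here is a tiny VBScript illustration of the 5-run behaviour described above; the numbers and variable names are invented for the example and are not the tool's actual code.

      ' Illustration only: drop the single high and low result, average the rest.
      times = Array(61.2, 58.9, 60.1, 74.5, 59.8)   ' five runs of one benchmark, in seconds
      lo = times(0) : hi = times(0) : total = 0
      For Each t In times
          If t < lo Then lo = t
          If t > hi Then hi = t
          total = total + t
      Next
      WScript.Echo "Trimmed average: " & (total - lo - hi) / (UBound(times) - 1)   ' prints 60.37
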

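    And for the Journal nerds, a minimal sketch of the conditional-journal idea mentioned above. The flag and commands are placeholders, not the actual contents of the Model_creation journal; it only shows how a VBScript If block lets one journal adapt at playback time.

      ' Illustration only: journals accept VBScript, so a recorded block can be
      ' skipped or swapped depending on a condition evaluated during playback.
      isStructure = False   ' placeholder flag; the real journal derives this differently
      If isStructure Then
          Jrn.Command "Ribbon" , "..."   ' steps recorded for Revit Structure
      Else
          Jrn.Command "Ribbon" , "..."   ' steps recorded for the other verticals
      End If
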

    Please keep all discussion about the tools in this thread, and use the Results threads ONLY for results.

    One last note: some folks have had issues with the downloads being corrupted. If you have this issue, I recommend trying the download with a browser that is NOT Internet Explorer. IE9 may be better, but even IE8 sometimes results in a corrupted download. Chrome and Safari for Windows have both worked well for me, and others have reported good results with Firefox.

    Thanks all, I hope you find RFO Benchmark useful.

    If you are looking for the old Benchmark and results, they can be found here.

    Thanks!
    Gordon


    NOTE: Downloads updated to version 2.1, which addresses the European decimal bug, amongst other things. Comparisons between tests run on v2.0 and v2.1 are valid.
    Attached Files

  2. #2 brencass (Junior Member)
    Is there a separate virtualization RFO download, or is it built into the downloads already indicated?

  3. #3 Gordon Price (Administrator)
    Quote Originally Posted by brencass
    Is there a separate virtualization RFO download, or is it built into the downloads already indicated?
    The Benchmark download can be used for virtualization testing or RM (Real Machine) testing. It is just a script that is run on a machine, virtual or not, that then runs Revit with journal files.
    The results for virtualization are broken out because results can be so dramatically different from RM results, and because we had a dedicated thread for the previous test.
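
    For anyone curious what "runs Revit with journal files" looks like in practice: a recorded journal can be replayed by passing its path to Revit on the command line, roughly like the line below (both paths are placeholders). The benchmark script essentially automates this for each journal in the chosen set.

      "<path to Revit.exe>" "<path to a journal file, e.g. Model_creation.txt>"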

  4. #4 brencass (Junior Member)
    Noticed an issue with the script. I have posted about it within my Revit 2015 benchmark post, but thought I best post it here as well.

    The script is indicating the graphics card I have has only got 4GB of RAM when it actually has 6GB dedicated. It would also be good if it could pull the hard drive type, because an SSD will affect performance from a read/write point of view compared to a standard HDD.

    I have noticed in another person's post in the 2016 thread that they have a similar card to mine, and it is also indicating 4GB when it should be 6GB, as the 980 Ti card's standard RAM config is 6GB. GeForce GTX 980 Ti | Specifications | GeForce

  5. #5 Gordon Price (Administrator)
    Brencass, all the data in the Machine Spec section is pulled directly from WMI (Windows Management Instrumentation), and unfortunately, when the card is reporting the wrong RAM it is because the card is reporting the wrong data to Windows.

    But your point about SSDs is well taken. I'll see if there is any way to extract that. My concern is there may not be an easy way to correlate where the Benchmark is run from against what the drive is. For example, WMI will report that the boot drive is an SSD, but the benchmark could be run from a network share. I'll have to explore how to ensure that any reported data is in fact meaningful data.

    For what it is worth, though, an SSD hardly has any real impact on the benchmark. It makes launching Revit faster, and it makes opening and saving local files faster, but the other 99% of the time it makes little difference at all. Speeding up the launching of Revit is of real benefit to users, but due to the nature of Journal files there is no easy way to get that into the benchmark. Though it may be something to explore for next year. Always something to explore for next year.
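
    For anyone who wants to see what WMI reports on their own hardware, a small standalone script along these lines (saved as a .vbs and run with cscript) queries the WMI class that video card data generally comes from. This is only a sketch of the idea, not the benchmark's actual code. It is also worth knowing that WMI's AdapterRAM property is a 32-bit value, so WMI itself cannot report much more than 4GB of video memory, which may be part of why a 6GB card shows up as 4GB.

      ' Sketch: ask WMI for each video controller's name and reported memory.
      Set wmi = GetObject("winmgmts:\\.\root\cimv2")
      For Each gpu In wmi.ExecQuery("SELECT Name, AdapterRAM FROM Win32_VideoController")
          WScript.Echo gpu.Name & " : " & (gpu.AdapterRAM / 1073741824) & " GB reported"
      Next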

  6. #6 MPwuzhere (Moderator)
    Going to be interesting to see how the "Expanded" tests turn out. My computer froze for I don't know how long while I was out at an appointment, but it pretty much took half a day to run. Luckily it was my home computer, not my work laptop.

  7. #7 Gordon Price (Administrator)
    Yeah, Expanded is a beast. It adds an Update benchmark, plus the Link and Array; the entire Model creation runs with the 24 extra copies of the linked model in place; Export does PNGs at 300 dpi instead of 150 and exports all the views to DWG instead of just some; it renders using both render engines; and it does the graphics test with the 24 links in place as well. And then (in that great Robin Williams on Golf voice) it does it "four more %@$#ing times!" My advice: run it overnight the first time, just to be safe. I should probably add that to the readme.
    That said, you should have seen my MBP nearly melt down when I automated all three Sets, for all four verticals, for all three years. It basically just ran for a day and a half. Thankfully, with a VM I was able to still do some work while it ran in the background. If I had left it alone it might have finished in only 24 hours.

  8. #8 Morten (Junior Member)
    I still can't get the Benchmarks to work.

    In Revit 2015 I get a Journal Error, and in Revit 2016 the results are like this:

    RFO Benchmark v2 - RVT 2016 'Simplified' Test Set 2015.09.25 @ 14.31.01 on CADTEST128
    build: 22.09.2015
    RevitForum.org

    All times are in seconds, lower is better.

    Model creation and view export benchmark
    0.00 opening and loading the custom template
    0.00 creating the floors levels and grids
    0.00 creating a group of walls and doors
    0.00 modifying the group by adding a curtain wall
    0.00 creating the exterior curtain wall
    0.00 creating the sections
    0.00 changing the curtain wall panel type
    0.00 export all views as PNGs at 150 dpi
    0.00 export some views as DWGs
    0.00 TOTAL

    v2 Simplified Graphics benchmark [1]
    0.00 refresh Hidden Line view x12 - with hardware acceleration
    0.00 refresh Consistent Colors view x12 - with hardware acceleration
    0.00 refresh Realistic view x12 - with hardware acceleration
    0.00 rotate view x1 - with hardware acceleration

    [1] Graphics benchmark measures the entire graphics stack, which includes CPU and memory.
    To meaningfully compare graphics cards, test all cards in the same machine
    and use the 'v2 Graphics - expanded' benchmark.


    MACHINE SPEC:
    Mfr: LENOVO
    Model: 30AGS00K00
    OS: Microsoft Windows 7 Professional 64-bit

    CPU: Intel(R) Xeon(R) CPU E3-1271 v3 @ 3.60GHz
    Max Clock Speed: 3.601Ghz
    Number of Processors: 4
    Number of Logical Processors: 8

    Total Physical Memory: 32GB @ Mhz


    Graphics Card: NVIDIA Quadro K2200
    Graphics RAM: 3.9990234375GB
    Driver version: 9.18.13.4803


    Screen Resolution: 1680x1050x32bit @ 59Hz
    Benchmark did run full screen

  9. #9 Gordon Price (Administrator)
    Morten, any chance you might be available for a GoToMeeting over the weekend or early next week? I would like to troubleshoot this with you and see if we can't figure out what is going on.

    Thanks!
    Gordon

  10. #10 Morten (Junior Member)
    Gordon, yes, I'm available this week.
    /Morten


