

Thread: Atomising Data

  1.    #1
    Moderator snowyweston
    Join Date: December 21, 2010
    Location: C.LONDON
    Posts: 3,548

    Atomising Data

    Probably completely the wrong term, but I like the word, so I'm sticking with "atomising". Anyways...

    Playing a bit with PowerBI today, I've been noting holes* in my data and making a mental note of what to consider for my next round of Template development.

    *holes not as in information that's missing and should be coming out of Revit, but information I could really benefit from if it were coming from Revit


    Specifically: when dealing with large-site concept-stage work, the ability to "drill" up, down and all around a dataset, querying all the while, is increasingly attractive. The "issue" (if you could call it one) is how one might populate said properties, and into which, and how many, parameters for best effect.

    A lot of this thinking (read: rambling) stems from my love of "reverse-order" and hierarchical systems, where I almost always wish to separate information to the point where nothing is (effectively) a "concatenated" (compound?) field of information, i.e.:

    <NameFull> = "Mr Snowy Weston"
    is actually: (<Title>," ",<NameFirst>," ",<NameSecond>)
    where one might use some tricksy little Dynamo to concatenate the bit-parts into said <NameFull> parameter.
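As a minimal sketch (assuming the parameter values have already been read into plain strings, as one might inside a Dynamo Python node), that concatenation is just:

```python
# Hypothetical "atomised" values, as read from separate parameters.
title = "Mr"
name_first = "Snowy"
name_second = "Weston"

# Concatenate the bit-parts into the compound <NameFull> field.
name_full = " ".join([title, name_first, name_second])
print(name_full)  # Mr Snowy Weston
```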


    Elevated to the "large-site concept-stage work" scenario, doing so requires a (very) top-level-down approach to the notion of taxonomy, i.e.:

    >Job
    >>Site
    >>>Plot
    >>>>Block
    >>>>>Building
    >>>>>>Floor
    >>>>>>>Zone
    >>>>>>>>Tenure
    >>>>>>>>>Tenure (Sub-Type)
    >>>>>>>>>>Department
    >>>>>>>>>>>Area
    >>>>>>>>>>>>Room/Space
    >>>>>>>>>>>>>System
    >>>>>>>>>>>>>>Element
    >>>>>>>>>>>>>>>Sub-Element
    >>>>>>>>>>>>>>>>Part

    (this is my current, off-the-cuff, list - and I know there are "true" organisational schema for this kind of stuff, but hear me out)


    So I've started this thread to "specifically" question the sense in pursuing this to the Nth degree, i.e.:

    Take UK Building Use Classes for example.

    Many would, quite fairly, create a Shared Parameter <Use Class>, populate it with "A1", "B2" or "C3(b)", and be done with it.

    Me? I'm presently toying with having:

    <UseClassPart> = "C"
    <UseClassType> = "3"
    <UseClassVar> = "b"
    Then having Dynamo populate:
    <UseClass> as a concatenation of the above (separate) fields.
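That Dynamo step might look like this sketch, assuming the compound form wraps the variant in parentheses as in "C3(b)" (the function name is mine, not an established API):

```python
def concat_use_class(part, use_type, var):
    """Build the compound <UseClass> string from its atomised parts.

    Assumes the variant, when present, is wrapped in parentheses."""
    code = part + use_type
    if var:
        code += "(" + var + ")"
    return code

print(concat_use_class("C", "3", "b"))  # C3(b)
print(concat_use_class("B", "2", ""))   # B2
```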

    Why?

    Because I like to have things "atomised" in Revit to allow different levels of data-management and visualisation.

    Now (of course) I know we can define view and schedule filters using search terms like "Starts with"... but downstream, say in something like PowerBI, Access, or more commonly in Excel, I tend to find it far more helpful having the information as separate, "nuclear" columnar data than having to set up multiple COUNTIFs and SUMIFs. Consider also that certain/many classification "systems" (the UK Building Use Class coding example included) do not follow a logical/consistent/alphanumeric taxonomy, which makes rule-based queries of compound fields increasingly difficult.
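To illustrate that downstream benefit with a toy example (invented rooms and areas, plain Python standing in for what Excel or PowerBI would do):

```python
from collections import defaultdict

# Toy dataset: (room, use_class_part, use_class_type, area).
rooms = [
    ("R1", "C", "3", 120),
    ("R2", "C", "3", 80),
    ("R3", "A", "1", 200),
    ("R4", "C", "4", 60),
]

# With atomised columns, rolling area up to any level of the
# taxonomy is a simple group-by on that column...
area_by_part = defaultdict(int)
for _room, part, _use_type, area in rooms:
    area_by_part[part] += area
print(area_by_part["C"])  # 260

# ...whereas a single compound "C3(b)" column forces string-parsing
# (COUNTIF/SUMIF-style prefix matches) for every such query.
```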

    So am I crazy? Do any of you have a similar masochistic tendency to drive things down to the itty-bitty as well? I'm really curious if I'm in my own little world here.

  2.    #2
    Forum Addict GMcDowellJr
    Join Date: December 21, 2010
    Location: Phoenix, AZ
    Posts: 1,752
    I don't know the specific example, but can you go the other way? Enter it as UseClass and have Dynamo tear it apart to populate Part/Type/Var?
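Going the other way is easy enough to sketch. A hypothetical parser, assuming codes shaped like "C3(b)" (letters, then digits, then an optional parenthesised variant):

```python
import re

def split_use_class(code):
    """Tear a compound use-class code like 'C3(b)' into Part/Type/Var.

    Assumes the shape: letter(s), digit(s), optional '(variant)'."""
    m = re.fullmatch(r"([A-Z]+)(\d+)(?:\((\w+)\))?", code)
    if m is None:
        # Natural-language entry won't always fit the pattern;
        # flag the record rather than guess.
        raise ValueError("Unrecognised use class: " + repr(code))
    part, use_type, var = m.groups()
    return part, use_type, var or ""

print(split_use_class("C3(b)"))  # ('C', '3', 'b')
print(split_use_class("A1"))     # ('A', '1', '')
```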

    This is a typical issue databases face. If you don't, or can't, get the data separated to the granularity you need to run reports, it doesn't do you any good. On the other hand, it's a lot easier for a user to enter data in natural language.

    I would say that, if you're doing this for you and aren't expecting others to enter the data on the fly in the same way, then you're on the right track. But if this is an example of how you'd like others to enter the data, you're going to get a fair amount of errors in your data stream.

    You can estimate the severity of the errors using Little's Law (rate of errors entering the system x length of time under consideration = estimated number of errors in the system; basically R x T = I). If you can live with the errors or repair them on the fly, then it can work. Otherwise someone (you) has to repair the data manually.
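With invented numbers, the R x T = I estimate is just:

```python
# Little's Law applied to data entry: I = R x T.
# The figures below are invented for illustration.
error_rate = 3        # R: erroneous entries per week
weeks = 12            # T: length of time under consideration
errors_in_system = error_rate * weeks  # I
print(errors_in_system)  # 36 errors to live with, or repair
```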

  3.    #3
    Member
    Join Date: March 21, 2013
    Location: Phoenix, AZ
    Posts: 120
    Quote Originally Posted by snowyweston
    Because I like to have things "atomised" in Revit to allow different levels of data-management and visualisation.
    OK, but does it have to be IN REVIT initially? Dynamo has the ability to read Excel files; would you really need the data duplicated in the model and an Excel file? In other words, it may be more effective to query the Building Use from an Excel file than from the Revit model.

    Quote Originally Posted by snowyweston
    but downstream, say in something like PowerBI, Access, or more commonly in Excel,
    I don't want to mis-speak, but I think this is what Autodesk's "Forge" concept is about. The API has access to the model data, and developers can build a platform to do what you're describing - a client to access the model, pull data from it, and make it accessible to other programs (or your own web interface). In your case, you might want your "data visualization" program to also reference an external Excel or Access file.

    Also note that Dynamo expects to be running continuously on a Revit model. I've found it difficult to automate things that I want to happen only once (delete these notes and replace with these).

    Quote Originally Posted by snowyweston
    Considering also, that certain/many classification "systems" (the UK Building Use Class coding example included) do not follow a logical/consistent/alphanumeric taxonomy, ...
    For those situations, I replicate the conditions of the table as closely as possible. These tables and equations are often buried in annotation objects, which allows Type Catalogs to "prime" some of the fields. For example, tables that cross-reference Occupancy and Building Type (IBC 503) would need a Type for each combination. That said, prescriptive tables are easy to replicate with programming (you wouldn't need all the Types in Revit). Added bonus: most of my complicated Type Catalogs start from .xlsx files anyway.
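A prescriptive cross-reference table like that reduces to a plain lookup when replicated in code. A sketch with placeholder values (NOT actual IBC 503 figures):

```python
# Hypothetical Occupancy x Construction Type -> allowable area table.
# Values are placeholders, not real IBC 503 entries.
allowable_area = {
    ("B",   "II-A"): 37500,
    ("B",   "II-B"): 23000,
    ("R-2", "II-A"): 24000,
}

def lookup_area(occupancy, construction_type):
    """Look up the allowable area for one Occupancy/Type combination."""
    return allowable_area[(occupancy, construction_type)]

print(lookup_area("B", "II-A"))  # 37500
```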

    I'm currently having fun making Dynamo use the API to "get around the UI" and get things done in Revit. Bulleted lists are fine, but now I can use Excel to select, re-order, and edit a master list of notes to use on multiple projects. Dynamo reads the formatting from Excel and sets the list formatting directly through the API. So I'm only using Revit components to display the information (not store it): Excel and Dynamo control the content.
