Tuesday, June 19, 2012


I feel misled by my media upbringing. It's 2012 and I thought we were promised weaponized lasers hijacked by Doogie-Howser-style geniuses to pop popcorn and destroy a house. Instead we use lasers to scan millions of points and stitch together impossibly intricate point clouds, gathering current-condition 3D models to an unparalleled degree. At the same time, Building Information Modeling software has matured to include facilities models delivered for new construction and renovation work.

High Definition 3D Laser Scanning. Tools have emerged that allow us to visualize and vectorize (I added that word to my dictionary) a point cloud in Revit. I think there is a lot of confusion regarding this technology train wreck and I thought I would throw some gasoline on the fire.

To scan is not to model; those are two very different plateaus of work. Many scanning providers are looking to expand their services by delivering not only the scan but also a model, and in some cases an intelligent (read: Revit) model. In an effort to ramble as little as possible, here are a few points and technology reviews about scanning and BIM.

Point 1: A scan, all by itself, can be of great value for capturing existing conditions for coordination and communication purposes. If there is not good reason to add the complexity of a model, don’t.

Point 2: Users modeling in Revit from a scan have to be very good with Revit.

Point 3: There is NO one tool that delivers the ability to visualize and vectorize a point cloud into a building information model, but a series of tools that have unique functionality.

Point 4: Point clouds are really complex and not right for many situations; this is not an area of a project where you want to square-peg-round-hole it. Get it wrong and you'll badly over-promise.

Autodesk's PCG file format - Not enough visual options, but you can snap to points as you draw in Revit.

Leica CloudWorx for Revit - Uses an IMP file to store and manage the database. Lots of visualization options; the clipping plane management is awesome and makes this an essential tool for working with point clouds of any format (as long as you can get it into Leica's database). Pipe tools are really fast and accurate.

IMAGINiT Scan to BIM - Wall Region Grow, anyone? It's awesome, and nothing touches it from an architectural modeling standpoint. From an MEP side, it handles round and rectangular duct. Oh, and did I mention that it works on Leica point cloud databases? As essential a modeling tool as CloudWorx is a visualization tool.

Autodesk Labs Feature Extraction - This one is clearly a technology preview, not done cooking yet. It does work with Leica point clouds and has an extremely promising building footprint tool.

At IMAGINiT we have great partnerships with Autodesk and Leica, as well as a very talented internal software development team (part of our parent company Rand Worldwide). In other words, we have many more experts and opinions on BIM and 3D HDS; inquire within.

Tuesday, June 5, 2012

The New BIM Population

You might have noticed them, invading your user groups, listening intently at every event, asking different questions, and then they introduce themselves… Facilities groups. Owner representatives with increasing expectations and a pressing need to leverage technology. Sound familiar? Well it should. Facilities' use of BIM is a decade behind the AEC industry, and long-term owners are now becoming aware of BIM and asking for it. Are you ready?

These are usually organizations in flux, expanding or under corporate mandate for a massive technology shift. They find themselves needing to assert themselves where they haven't before, without clear direction as to the ramifications of what they are asking for. 

Usually facilities groups are educated in BIM when new construction projects are in the design/documentation/construction process. Attending a BIM coordination meeting, they see the value of the model and they are hooked. What they miss is all the steps it took to get there. The mechanics of collaboration seem straightforward: just put the models together, coordinate, and build better buildings.

This situation turns the architect/engineer/contractor into a BIM educator when the stakes are high. Oversell and the expanded scope eats your margin up. Undersell and you risk seeming unsophisticated in BIM and losing future work. The right balance of education and added value for you and your clients could spell tangible success. Success that you can both wear proudly, success that shepherds your firm into the next decade of model integration.

Wednesday, May 30, 2012

Be The Smartest Person in The Room

This one is really, really easy. Honestly, I can't believe we ever agreed as an industry to do this the way almost everyone has been.

When worksharing is enabled and the first save creates your central model, many people append _CENTRAL to the end of the file name. So when a local file is created you get: ProjectName_CENTRAL_USERNAME

It's long, it's redundant, and the CENTRAL moniker does nothing for you. Instead use that space for something useful, something that provides valuable information about your Revit file. End your file name with a version number: ProjectName_2013_USERNAME

If you are currently appending CENTRAL to your file names, float this suggestion and I bet everyone agrees with you. An easy way to be, at least for a moment, the smartest person in the room.
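If a project already has a pile of _CENTRAL files, the rename is mechanical enough to script. A minimal sketch (the file names and version number here are made up for illustration):

```python
import re

def modernize(filename, version="2013"):
    """Swap the redundant _CENTRAL marker for a Revit version number.

    Hospital_CENTRAL.rvt        -> Hospital_2013.rvt
    Hospital_CENTRAL_jsmith.rvt -> Hospital_2013_jsmith.rvt (local copy)
    """
    return re.sub(r"_CENTRAL", "_" + version, filename)

print(modernize("Hospital_CENTRAL.rvt"))
print(modernize("Hospital_CENTRAL_jsmith.rvt"))
```

Local copies keep the username suffix, so the convention survives the rename untouched.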

Wednesday, May 16, 2012

Mass Effect

Modeling your project during conceptual design can drive the most benefit during the "pre-schematic, post-pre-design" stage. Unfortunately this is usually not a time in a project when much attention is being paid to a 3D digital model. When you model conceptually you can gain a deeper understanding of your project without spending abundant hours developing any single idea. Massing tools are how we model iteratively without weighing our ideas down with detail.

Not all massing tools are created equal. SketchUp is great for quick modeling and visualization, but the data behind the model isn't there, and the downstream use of those objects in analysis and documentation in Revit is very limited. There are some tools that allow you to analyze inside of SketchUp (the IES toolkits have a plug-in for it), but you can't do anything with these models later on. More often than not designers misuse SketchUp by creating detailed forms just to get an image. This wastes hours on a project without creating anything to leverage later on. Models in Rhino usually suffer the same fate.

The massing tools inside of Revit can be used to create a 3D generic form that allows schedulable parameters of surface area, volume, perimeter by floor, and area by floor or building. These can also be used for high-quality renderings via FBX export into 3ds Max Design, or as rapid energy models for comparative analysis. Exporting the mass model's gbXML file also opens up many other analysis options in outside software. Autodesk Labs' Project Vasari expands on this even more with solar radiation and wind tunnel studies using mass models directly inside of a Revit-style interface.
The beauty of these types of massing studies is that they are quick and provide a lot of good comparative data. The key word there is "comparative". Model multiple options, then compare, to understand and proceed with your design.
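The arithmetic behind that comparative data is simple; a toy sketch outside of Revit (all numbers invented) shows the kind of side-by-side figures a pair of mass options yields:

```python
# Two hypothetical massing options: footprint (m2), floor count, floor-to-floor (m)
options = {
    "A - low bar": {"footprint": 1200, "floors": 5, "f2f": 4.0},
    "B - tower":   {"footprint": 700,  "floors": 9, "f2f": 4.0},
}

for name, o in options.items():
    gross_area = o["footprint"] * o["floors"]   # the kind of value Revit can schedule
    volume = gross_area * o["f2f"]
    height = o["floors"] * o["f2f"]             # a rough driver of envelope and shading
    print(f"{name}: area {gross_area} m2, volume {volume} m3, height {height} m")
```

Neither number set is an answer by itself; lined up next to each other they let you pick a direction and move on, which is the whole point of the exercise.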

I have worked with some firms that take massing to a new level inside of Revit to show programmed areas conceptually by using separate mass objects with individual materials assigned for color differentiation. There is quite a bit of legwork there, and if you need to respond to a tight space program, a piece of software like Affinity (from Trelligence) would probably be the best option.

I think a lot of us got caught up in the "low hanging fruit" pitch conceptual modeling allows. That is "low hanging fruit", not "fall into your lap fruit". There is some effort involved. You have to change your design workflow to accommodate a new contributor: the mass model.

Thursday, May 10, 2012

Habitually Successful

While teaching BIM software during my career with IMAGINiT I have noticed common habits among those that pick up the software quickly and are successful with it.

I am sure I am not the first, nor will I be the last to put a list like this together. This was just something on my mind. Without further ado, in no particular order....

Click every button - When you are learning a new application this lets you see all that is possible and can give you insight as to how objects, views, and features relate.

Follow exercises in class and practice outside examples - Throughout the course there are times for you to practice; use this time wisely. Start written exercises promptly, make sure you have some repetition on each part of the exercise, and do them again after the class. Draw your house; this will help you ask the questions you don't know to ask until you are in a real project.

Come with curiosity and questions - When you leave the first day of class, look at the buildings you pass, start to think about how that building envelope might be constructed with objects in Revit. If you don't know the answer, ask and I bet everyone will learn something.

Not afraid to click the mouse - Training classes, for the most part, are a controlled environment; you aren't going to mess anything up by clicking… a lot. At the same time, be mindful of on-screen cues.

Excited about the technology change and how it can affect their career - Attitude is 85% of the experience of learning new things. I see students get downtrodden over the smallest inconveniences in the software. Big picture: this is happening, and this is the time to learn it.

Punctuality - architects are often late, contractors are often early, <insert another generality here>. Bottom line: If you miss 5 minutes, they might be "the" 5 minutes that clarifies an important aspect of the software.

Understand standard Windows functionality (for example: the ability to find and save something to a specific folder on your C Drive) - This type of training in a class really gums up the works. Comfort using a mouse is a must as well.

Stay adequately caffeinated - This one is not universal. For me though, it helps.

Tuesday, May 8, 2012

User Created Worksets

One of the first things I look at when performing a Revit Health Check for a client is their use of user created worksets. First, is the list very long, with some worksets representing information that might be needed downstream (e.g. Revit schedules, Navisworks search sets, e-SPECS bindings)? Second, are the objects correctly assigned to their respective worksets?
More often than not, the list of user created worksets is very long and the components are not placed on the correct workset (thanks to Revit 2012, correcting this is much easier). Worksets should never be used to represent data that needs to be extracted, because defining an object's workset is a manual process. The beauty of Revit is that it eliminated the manual process of defining an object's type/layer, and the data inside of Revit components is organized in a database that is part of every object. Revit MEP systems are a perfect example of this: as you connect objects of a specific system type (e.g. Hydronic Supply, Return Air), they populate with that data. In essence you couldn't be wrong with Revit.
Worksets should instead be broad categories that relate more directly to job roles. Limit the number of worksets and your users will get it wrong less often; allow them to "set it and forget it" (if I may steal a line from the infomercial giant Ronco Inc.). The worksets will be easier to manage, and you can confidently use all the great performance-enhancing benefits of user created worksets.
Look for more content on User Created workset best practices for all disciplines on the IMAGINiT Portal and ProductivityNOW.

Thursday, May 3, 2012

Being a Bad Background

In an earlier post I described how architects and MEPF engineers can learn to benefit from a multi-discipline Revit environment by respecting and anticipating the pain points and natural workflows of the other. Now I'd like to talk about structural engineers.

The reliance on linked geometry to host elements really isn't present between Revit Architecture and Revit Structure. Instead there is the issue of redundant modeled geometry and the documentation reliance on structural elements in architecture.

Many firms have worked through the first of the two issues by defining clearly what elements the structural engineer will own and which the architect will. Structure might own the slab, architects might own the floor finishes. Structure might own the roof deck (usually modeled as a floor) and the architect will own the roofing above the deck (usually modeled as a roof). It takes a thorough LOD document, but it can and has been accomplished.

The other issue here is a little more troublesome: an architect's reliance on structural elements to complete certain document deliverables. A lot of architectural firms fake in structure so that they can get documents out the door (e.g. foundation and stoop conditions, trusses, framing). Significant time is lost when this has to be done, but sometimes it is necessary.

Architecturally you have to communicate those shared items that are required and when they are required. Structurally you have to make modeling accommodations for the architectural documents.

I can hear the structural engineers now: "easy for him to say". Well, it is easy for me to say, and it is easy for them to do. May I be the first person to say (although probably not really the first): structural engineers have had the least to change and adjust to in a Revit workflow. A little cooperation will go a long way on this one.

I want to point out that LOD really solves both of the issues outlined above, but it doesn't have to be the AIA E202 document. Think about a collaborative requirement, think about a usable "desktop" standard, think about a logical and timesaving document that might just save your profit and really set you apart. OK, horse officially beaten.

Tuesday, May 1, 2012

Being a Bad Host

I remember when I first started using Revit Architecture: we were modeling key structural elements ourselves and using linked CAD files from MEP to complete our RCPs. I would bug my reseller at every turn to help me understand how the disciplines would work together, without an answer. Then an "Autodesk guy", as he was referred to in later conversation, told our Revit user group that Revit MEP wasn't ready yet and that no one should use it. They of course continued to sell it.
Many years later, I am 3.5 years into my use of Revit MEP and have had the pleasure of seeing how many different firms are exploiting or suffering through a multi-discipline Revit workflow. The friction between Revit MEP and Revit Architecture seems to be the high level of dependency MEP objects have on host architectural faces.

It is very easy for an architect to delete and recreate geometry that is, unapparent to them, the host of an MEP object. This destroys the "warm and fuzzy" feeling Revit gives us about elements staying spatially coordinated. In fact it can be a nightmare even if the element remains hosted. For example, sometimes a ceiling might move for some design purpose. If hosted elements have hard connections (e.g. air terminals tied into ducts), the architect runs the risk of destroying duct networks that don't have the space to adjust.

Two strategies need to be undertaken to make this work better:

First, communicate design changes outside of Revit. This one is an age-old problem between architects and MEPF consultants. You either need to set up a brute-force way of communicating (an email report of model changes by room upon receipt of a new model) or a software-centric comparison automation (in Navisworks or the Compare Models Revit extension).
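At its core, even the brute-force report is just a diff of two snapshots of the linked model. A sketch with invented element IDs and locations (real tools work against the Revit database, but the logic is the same):

```python
def diff_snapshots(old, new):
    """Compare two {element_id: location} snapshots of a linked model."""
    deleted = set(old) - set(new)
    added = set(new) - set(old)
    moved = sorted(eid for eid in set(old) & set(new) if old[eid] != new[eid])
    return deleted, added, moved

# Hypothetical ceiling elements: id -> (x, y, elevation)
last_issue = {101: (0, 0, 9.0), 102: (10, 0, 9.0)}
this_issue = {102: (10, 0, 8.5), 103: (20, 0, 9.0)}  # 101 gone, 102 dropped 6"
print(diff_snapshots(last_issue, this_issue))  # ({101}, {103}, [102])
```

The moved list is the one that matters most for hosted MEP objects: element 102 is still there, so Revit stays quiet, but anything hosted to it just shifted.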

The most important change that needs to happen is both parties understanding the limitations, implications, and realities of a coordinated workflow in Revit.

Second, MEPF engineers need to respect and anticipate how the architect's model will change. MEPF engineers can host elements on reference planes where hosted and orphaned elements are sensitive to change. The model adjustment will be manual, but many hosted elements can be changed at once. Or maybe no elements should be hosted at all. I ran an unconference session at this past AU called "Leveraging an Architect's Model in Revit MEP" and this was the sentiment of the group that attended. Those that had once hosted on planes no longer do so. A vast majority actually have a standard of non-hosted components for everything.

Bottom line: this is a two-way street. Architects need better ways to communicate changes per model update, and MEPF engineers need to host (or not host) objects in a way that protects them against damage caused by changes out of their control.

Thursday, April 26, 2012

Moving the Sustainability Frontier: The MacLeamy Curve

Many architectural firms have reached some level of Revit use. Now we need to start looking at gaps in the process that result in redundant modeling and wasted time/money. The main gap architecturally is inside of early design, but sometimes this fractured workflow bleeds all the way through DD, making your CD milestones extremely difficult to reach. We have all seen it, the MacLeamy curve: more work up front and a nice easy taper into the end of CDs.

I was one of those people that knocked the MacLeamy curve, saying that it doesn't really apply to the realities of completing projects: workload always crescendos at the end because of the necessary reliance on 2D views and documentation embellishment, as well as inevitable last-minute changes. If the models we created were infused with all of the detail information and annotation, then yes, I could see it. But I have not seen a Revit model that even nearly qualifies; to me that is an unrealistic expectation. So instead of looking at the MacLeamy curve in hours or workload, I now look at it in decisions. That is when the curve becomes clearer to me. More decisions up front, so that the model calms down and construction documents can be mostly about documentation instead of active modeling. This would also greatly benefit the other disciplines that are using the architectural model to complete their own models and subsequent documents.

So what is going on during the ebb that we see in the curve during the SD and DD phase? Information input and decision making reliant on analysis of that information. In other words, modeling. Model early, model schematically, don't redundantly keep up multiple models. Streamline so as not to fork your curve.

Ask yourself: What is the first 2D form your project takes? What is the first 3D form your project takes? What kinds of questions can I ask to make better design decisions? What kinds of input do I need to answer those questions?

I am not suggesting that designers exclusively design inside of Revit. Designers should design where they are comfortable. I am suggesting that designers need to be aware of how far they are modeling the project and for what purposes.

This might seem a far cry from the Sustainability Frontier subject but I assure you it is not. In later posts I will delve further into using these early design models to drive better decisions and workflow considerations during this transition.

Tuesday, April 24, 2012

Is Navisworks It?

Right now the interest we are seeing in Navisworks has expanded exponentially. I attribute this to a handful of reasons. Autodesk's Suites have increased the overall reach of Navisworks by putting licenses in the hands of Revit users at a fraction of the cost of purchasing completely separate packages of software. Navisworks represents an extremely important platform for a BIM. We can now rely on the strengths of Navisworks as a near-universal model aggregation tool, including integration of DWF or DWG sheet files. Navisworks can truly hold all 2D and 3D intelligent information and even "connect the dots" digitally between the different file formats of the same project. There are also several tiers of Navisworks, which give the users that need to run and track clashes access to a deeper version of the software, and those that need only to view and mark up a free tool to do just that.

Get it? The Clash
Most new users want to use the Clash Detective tools inside of Navisworks Manage to collaborate digitally prior to construction. This, above all other functions, is the most valued in Navisworks and only available in the Manage version. The TimeLiner tool is a distant second in unique functions although, considering the improvements in 2013, it has the ability to pick up many construction simulation modelers looking to more seamlessly integrate data and geometry inside one comprehensive tool. Plus, there are more applications that can tap the data out to other formats (e.g. e-SPECS for Navisworks) for many different uses of the model.

The backbone of the solution is the bevy of formats that read well into Navisworks. Keep in mind: geometry fidelity and the quality, organization, and consistency of data inside of the geometry can make the difference between an easy integration and a process that is much more manual. Simply put, it all depends on what you put in and what you are asking it to do. Just because you can doesn't necessarily mean it is worthwhile. Pick your battles. There are also security features in Navisworks files that protect intellectual property and can serve as reliable record copies.

The simple tools for complex projects that Navisworks packs, as well as its increased saturation in the AEC market, create a real need in the industry to know the simple strategies to get the most out of the tool. This usually comes in the form of simple tips and ideas around setting up an iterative approach and generally getting your ducks in a row for digital coordination.

The point is Navisworks has the potential to be the platform for model aggregation for BIMs of all shapes and sizes, from all different locations. It has both a free solution and a comprehensive solution. 2D, 3D, 4D, 5D. Seriously. I remember the first time I told people that I work in software that goes to the 5th dimension; they looked at me like I was an idiot. And while it is an overly clever way to say "time and data" on top of the 3D, it is true. I would prepare for Navisworks to become a fixture in digital design, construction, and facilities management practices for the foreseeable future.

Thursday, April 19, 2012

The Sustainability Frontier

Today I am speaking at HKS's Green Week on BIM and sustainability. While preparing for this event I uncovered a number of topics that seemed a good fit for this blog. So this is the first in a series of posts on Moving the Sustainability Frontier with Software.

I read this a while ago and it really stuck with me; the sustainability frontier is a repurposing of the productivity frontier from the article referenced below:

"The productivity frontier is the sum of all existing best practices at any given time or the maximum value that a company can create at a given cost, using the best available technologies, skills, management techniques, and purchased inputs. Thus, when a company improves its operational effectiveness, it moves toward the frontier."

Porter, M. E. 1996. What is strategy? Harvard Business Review (November-December): 61-78.

Sustainability Frontier
The sustainability frontier is the sum of a firm's best practices at any given time or the maximum sustainability that can be designed and measured at a given cost, using the best available technologies, skills, management techniques, and purchased inputs. When a firm improves its operational effectiveness, it moves its frontier.

The key difference between Porter's definition of the productivity frontier and my repurposed sustainability frontier, besides vocabulary, is that the productivity frontier is determined by all existing skills, practices, and technologies and is absolute across an industry sector. The sustainability frontier is set per firm and aligns with that firm's specific goals. The sustainability frontier then moves not merely because more technology or skill is present, but because of how a firm chooses to use whatever resources it has at any given time. Businesses move toward a constantly changing frontier; an AEC firm moves its own frontier as new methods and technologies are implemented.

I want to finish this first installment on the sustainability frontier with a concept I stole from Bill Gates' TED talk on reducing our carbon footprint through the use of renewable energy generation.

CO2 = Number of people × Services per person × Energy per service × Carbon per unit of energy

The idea is that we need to get any one of these factors at or near zero. The number of people is increasing and the number of services per person is increasing; those factors can't really be affected. In the AEC industry we have a real opportunity to affect energy per service by using high-efficiency materials and systems, as well as using our design expertise to drive a more efficient form. Carbon per unit of energy is also something we can impact through the use of PV panels and wind generation.
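Plugging made-up numbers into the product shows why a single near-zero factor matters so much: the terms multiply, so halving energy per service halves the total, and driving carbon per unit of energy to zero zeroes emissions entirely.

```python
def total_co2(people, services_per_person, energy_per_service, carbon_per_energy):
    # The point of the formula: total emissions are a product of four factors,
    # so one factor at or near zero wins regardless of the others.
    return people * services_per_person * energy_per_service * carbon_per_energy

baseline = total_co2(7e9, 100, 2.0, 0.5)   # all numbers illustrative, unitless
efficient = total_co2(7e9, 100, 1.0, 0.5)  # better buildings: half the energy/service
clean = total_co2(7e9, 100, 2.0, 0.0)      # zero-carbon energy supply
print(baseline, efficient, clean)          # efficient is half of baseline; clean is 0
```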

Stay tuned for more on this subject in later blog posts.

Tuesday, April 17, 2012


Conversations about BIM are funny. Even firms we are there to help can come across as guarded about their BIM process and experience. I used to think it was distrust I was sensing; after some contemplation and a lot more experience, I believe it has much more to do with trying to manage a perception of BIM experience in an industry that is supposed to be inundated with it. I can tell you that mostly the industry is not inundated. There are very progressive groups and very intelligent, experienced people, but a vast number of firms are posing as BIM savvy or, even worse, "fully BIM implemented". Actually, now that I think of it, I'm not sure which is worse.

I cannot tell you what damage that does to the industry as a whole. In fact, it makes it almost impossible to manage expectations and turns a provable credential into a magic show of smoke and mirrors. Even those that really seek to maximize their BIM benefit are forced to exaggerate the ease of BIM and its downstream benefit.
Unfortunately I don't have an easy button for this one; it's just a problem. Around the Dallas IMAGINiT office we use the term "BIM-washing": the "yeah, yeah, we do BIM" that so many firms respond with when pressed. These are usually the statements of leaders/executives in a firm who have in the past purposely distanced themselves from the trenches of software technology integration. This is mainly a GC and architect problem; it is much harder for structural and MEPF firms to hide this skill gap when they enter into a contract.

This is an oldie, but it fits so well. The hilarious part is that this character's name is Tommy Flanagan (I'll be quick to point out it is a different spelling and pronunciation, hilarious though).
Those firms over-promise, under-deliver, and in many cases under-bid when the modeling scope expands, but they don't know any better. Generally they bring us all down. Now I know I'm preaching to the choir a little here, or a lot. But maybe it is time to force the hand of every firm to put their experience out there. Maybe it's time to stop selling BIM to owners and start educating owners about practicality and honest benefit without expanding the scope five-fold. I'm just riffing here, but this is a major frustration of mine. When "it" hits the fan, we are the guys they call. It usually goes like this: "Hey, I won some BIM work, so I need BIM. The project starts in a week, when can I get in a class? The class comes with the software, right?" In a sales environment this seems like a perfect situation, but it is not. They put us in the same somewhat impossible situation that they have stumbled their way into. It isn't a switch you flip; there are many caveats, many things to understand, many things to discover for yourself. It takes time. Even if you hired us to come to your office every single day, or hired one or two experienced people full time, it just isn't that easy. OK, well, now I'm just venting.

Let's stop BIM-washing together. Help me figure out how. Help yourself. Help the industry.

Friday, April 13, 2012

Long Lists: The Death of Revit Productivity

Revit project files are massive repositories of information. Views, data-rich model geometry, annotation elements, and schedules all exist in one location, putting a stunning amount of information at your fingertips. This amount of stuff to understand introduces some productivity challenges, and those challenges are not overcome without some well-practiced methods and a deep understanding of where your model bottlenecks for users. One of the most tangible metrics for measuring a progressive Revit process and standards adherence is the length of the different lists in the project file. This is about ease of use and standardizing your views and content.

Type Selector - This list seems to bloat for a couple of reasons: there are a ton of objects loaded into the model, including duplicate types, and type properties have been used to define important, schedulable object properties. Purging periodically works as long as you don't have the duplicate types placed somewhere in the file. The other thing to consider is whether type properties have been used too heavily when an instance property might provide useful flexibility. There are pros and cons to shifting your parameters around, so do so where benefit is most easily gained (e.g. the door schedule).

Line Styles - Even in a default template this list starts longer than I would like, with few options to delete or rename the out-of-the-box line styles. One can only hope to intelligently name the custom line styles so that they easily and logically sort themselves. If you begin the custom line style names with a number (e.g. 1-Solid, 2-Hidden) they are easy to sift through and order themselves at the top of the line styles list. Tip: the number doesn't have to directly correspond to the specified line weight; reserve the thinnest line weights for patterns and items in the distance.
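The trick works because digits sort ahead of letters in a plain alphabetical list. A quick sketch (style names invented) makes the behavior obvious:

```python
# A mix of out-of-the-box style names and numbered custom styles
styles = ["Wide Lines", "1-Solid", "Thin Lines", "2-Hidden", "3-Dashed Fine"]

# Numeric prefixes float the custom styles to the top of an alphabetical sort
print(sorted(styles))
```

The numbered styles cluster together at the top in the order you chose, while the stock names fall in behind them.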

Project Browser - There are a few areas here to pay close attention to. First, duplicating view types (in 2013 available for floor plan, ceiling plan, and 3D views) has always worked well for architectural sets. Many firms have a custom view and sheet browser organization, often sorted by custom parameters. My only tip here is not to overload your views with multiple view purpose parameters. Keep it simple with Discipline and Sub-discipline, and use duplicated view types to shorten those view lists. The Project Browser can also be useful when placing families, because they are all organized by their object category. When placing a component from the Type Selector you have one list that includes plumbing fixtures, generic models, planting, etc. all in one place; the Project Browser lets you expand the list by category, and you can click and drag the types into the view to place an instance.

Detail Components - These components are all the same object category, so you have to be very mindful of naming conventions as you populate your model with these 2D components. These lists can get very long, and without a naming convention to sort them, users will really struggle to work quickly and consistently.

Tuesday, April 10, 2012

BIM is Software: Part Deux

One of my very first posts on this blog ended up being one of the most viewed and commented on: BIM is Software. Admittedly I was looking to ruffle some feathers with the title alone, and the minimal promotion of my blog I did on Twitter quickly uncovered an adamant population seething over my "silly" assertion. In my first afternoon on Twitter my idea was called "stupid" and I was mocked as a "so-called expert". I found myself in a heated exchange with strangers over something I was trying to point out was simply a disagreement in semantics. This was somewhat of a baptism by fire for me on the social media side of things.

Here we are a month later and I wanted to revisit the post, amend it, and prod further. The practice of architecture, engineering, and construction is not defined by the tools it uses, and BIM is not any one of those tools. That being said, BIM-ing happens in software, the BIM exists in a piece of software, and while process can be refined in and out of software, it is about what BIM enables through software.

The three pillars of this type of change are People, Process, and Technology, and while you might say that technology is the only one that relates to software, I adamantly disagree. It is all software. The software is often easy to learn and use; understanding how to leverage the software tools to their maximum benefit is much harder and requires knowledge of a firm's unique goals, skills, and existing workflows. But in the end it is about the software.

There is art and science at the heart of the industry's time-tested practices and I am not trying to change that. What BIM does requires software to benefit the art and science already in play. To see the software as simply a tool is fairly shortsighted; it isn't just a better instrument, it changes the medium. To summarize: art is art, a paintbrush is a paintbrush, a hammer is a hammer, BIM is software.

Finally, I do want to give credit where it is due. I thought Troy Gates summed it up best with his broad definitions of the 3 major acronyms of our corner of the industry: BIM, VDC, IPD.

IPD - Relationship
Integrated Project Delivery is the arrangement of parties into a unified team to deliver the project, sharing together in risks and rewards to remove the adversarial relationship between the owner, builder and design team.

VDC - Process
Virtual Design and Construction is the management of the product, the process and the organization of the design/build/operate team to develop a complete, virtual description of the entire project.

BIM - Instrument
Building Information Modeling is the digital representation of a facility’s physical and functional characteristics. BIM tools are the means by which we harness, communicate and leverage that information.

The application of all 3 from the bottom up: Just as Revit is not required to do BIM, but is one of the best enablers of it, BIM is not required for VDC but significantly benefits it. Likewise, IPD does not require the use of VDC but is facilitated by its use and considerably constrained in its absence.

Well put Troy. You can follow Troy on Twitter @TroyGates. 

Thursday, April 5, 2012

What Do I Mean By Model Integration?

I wanted to take a post to explain a commonly used term that carries a lot of importance as software tools improve and interoperability increases, model integration. Model integration is the ability to take a single model and use it for design, analysis, documentation, virtual construction and finally facilities management. Each of these will be explored further in future posts but let's start by taking a look at the opportunity and challenges we face in this effort.
It would be completely irresponsible of me to tell you that this is, top to bottom, entirely possible and incredibly easy. At the very least it takes a very pointed LOD document and a fair amount of model repurposing. Beyond that it might take a good deal more scope, so be prepared: when the client asks for a model they can use in FM, it might mean a whole lot more modeling and data integration. There's that word again, integration.

Now let's take a look at what I consider to be the major hurdles to model integration:

Willingness to Share Models - This was a bigger problem 2-3 years ago, but the fact that sharing a Revit model means all of your content goes with it makes people understandably nervous.

Consistent Modeling - If you don't know what fidelity of model you are getting, how can you reliably use it? Many in the industry have taken the dive into the LOD document, but how do we make that more than just an addendum to a contract? How do we bring it into the modeling habits of the everyday user?

Entrenched Belief - In such a quickly changing industry it is impossible to assume the quality of the model you get today is the quality of the model you will get tomorrow, so don't make rules that are hard to walk back. Several years ago a General Contractor would scoff at the idea of using a designer's model; the most progressive of that group would simply reply "I'll model it myself". Now when we show contractors the easily leverageable elements in a Revit model, most can't believe what they are seeing. It won't replace every manual effort, but certainly some if not most.

Limitations of File Formats and Software Capabilities - This one mainly has to do with solutions being oversold on their "automatic capabilities". Software is software; I am not trying to change the software, I am looking to saddle it and take it for the most beneficial ride I can. Great strides have been made recently to make models friendlier to construction modeling repurposing, and more is coming on integrating structural and building performance analysis. I get stuck in the "It's 2012! Why can't it do that yet?" mindset, but that doesn't matter in the end. Keep in mind this is not a stagnant technology; today's issues are tomorrow's new functionality. The sky is the limit.

I hope to explore the model integration process and issues for each of the 5 types of models in more detail during the course of this blog. Stay tuned.

Monday, April 2, 2012

Waves of the Future?

Every year or two it seems to be something different. Another epic shift in how the AEC industry will do business: BIM, IPD, Cloud Computing, etc. Not to say these technologies haven't made a fair impact, but they fell far short of the sweeping change most speculated would take place. Things are not stagnant, but there is something eerily familiar about each new idea: the underlying sentiment always pursues data integration and access. This most likely will happen at a pace the industry can adapt to naturally, as it has done with CAD and now BIM.

A good friend and colleague of mine, Don Bauman, pointed out to me the common trend of data integration in its many forms throughout the late 20th Century and still today. Don's industry experience and longer-term perspective have been invaluable to me. Not to "out" Don's age or anything, but he has been at this quite a bit longer than I have. Let's just say Don was spreading the good word of technology while I was in my Spiderman Underoos playing He-Man (17 was such a great age).

No matter what the technology, or the semantics that surround them, finding better ways to integrate and access data has always been the goal. People, Process, and Technology will always be the mechanisms (aka hurdles).

It does feel very exciting right now: the conversations, the challenges that are presenting themselves, and the industry professionals who aren't looking for an excuse but looking for an opportunity. When I was a kid I wanted to be a doctor; I remember thinking "there will always be something new to discover, when some problems are fixed others will arise". Now I am an AEC software technologist and luckily for me I feel the same way. It's almost as if I had followed that childhood dream, just minus all the schooling, insurmountable student debt, and prestige. That and I don't have to give anybody a prostate exam or examine any sort of terrible rash, which I am also grateful for.

Tuesday, March 27, 2012

Revit 2013 - Droppin' Like You're on Subscription.

It's official, Revit 2013 is here, or coming, or at least we can talk about it now. In reality very few firms decide to move to new versions of software until the Fall, Winter, or Spring following the release. So these new features might not be implemented in your firm for quite a while but without a doubt it will happen so let's discuss a couple of items.

This one is personally my "holy crap!" improvement of 2013. They have been tweaking the material dialog box for a couple of releases now, and I always scratched my head at changes that seemed somewhat cosmetic and only marginally beneficial to the everyday user. It certainly never improved in a true game-changing way; that is, until now. I do a lot with energy analysis and the dreaded gbXML file, and the complaint from most is the lack of intelligence to derive thermal characteristics from the modeled elements. You always had to pick from a static list of construction assemblies whose assignments were determined by an adjacency calculation, not by what was modeled. Now you can calculate the thermal properties of layered elements based on the thermal assets of the individual materials in a wall/floor/ceiling/roof system family. Component families (doors/windows) are still picked from a list as a type property. You still have the option of assigning thermal assembly types in the traditional way, but now we are one crucial step closer to intelligent integrated analysis.
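To make the difference concrete, here is a rough sketch of the kind of derivation this enables: summing per-material R-values through a layered assembly instead of picking from a canned construction list. The XML fragment, element names, and values below are simplified and hypothetical, not an actual Revit gbXML export.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative gbXML-style fragment (hypothetical, not a
# real export): one construction referencing one layer of two materials.
GBXML = """
<gbXML>
  <Construction id="wall-1">
    <LayerId layerIdRef="layer-1"/>
  </Construction>
  <Layer id="layer-1">
    <MaterialId materialIdRef="mat-gyp"/>
    <MaterialId materialIdRef="mat-insul"/>
  </Layer>
  <Material id="mat-gyp"><R-value>0.45</R-value></Material>
  <Material id="mat-insul"><R-value>13.0</R-value></Material>
</gbXML>
"""

def construction_u_value(xml_text, construction_id):
    """Sum the layer R-values for a construction and return U = 1/R."""
    root = ET.fromstring(xml_text)
    materials = {m.get("id"): float(m.findtext("R-value"))
                 for m in root.iter("Material")}
    cons = next(c for c in root.iter("Construction")
                if c.get("id") == construction_id)
    layer_ids = [l.get("layerIdRef") for l in cons.iter("LayerId")]
    r_total = 0.0
    for layer in root.iter("Layer"):
        if layer.get("id") in layer_ids:
            for mat in layer.iter("MaterialId"):
                r_total += materials[mat.get("materialIdRef")]
    return 1.0 / r_total

print(round(construction_u_value(GBXML, "wall-1"), 4))
# → 0.0743 (R-total = 0.45 + 13.0 = 13.45)
```

The point of the sketch is only the direction of the derivation: thermal characteristics flowing up from the modeled materials rather than down from a static assembly list.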

Stair type properties and their embedded type properties dialog boxes
Stairs can be modeled with run and landing components or, as before, with a sketch. The components are a really fun and flexible way to model stairs that help you understand what your stairs will look like in 3D before leaving stair creation mode (the images below are from within stair creation mode). In addition to the new ways we can create stairs, they have also given us the flexibility to convert any component-based landing or run to a sketch. The stair types now have embedded type properties dialog boxes for the landing, support, and other elements that define the stairs.

I don't mean to oversimplify everything that is going on in these releases, but I only have so much time to write here. If you want more information, attend one of the web-based presentations that will take a much deeper dive on each discipline. IMAGINiT's Know It. All. Virtual Event will be held May 1-3, with 1 day dedicated to each market sector (building is on May 2). As always, stay on top of the new features and keep opportunity at the forefront of your mind as we go through another year of software releases.

Finally, if you are not using the 2012 version of Revit yet, get with it. It is time. You are only hurting yourself, and your MEP consultant.

Monday, March 26, 2012

Why Would You Model Gyp Board?

Ok, time to get a little more technical and, I guess, quite a bit more nit-picky. Before I begin I will just say that I have completed jobs in the industry as a production guy and as a BIM manager, at first modeling gyp board in walls and then acquiescing and going without. What I have found for myself seems to go against the grain of how most firms I see using Revit model walls.

I am a recovering gyp board modeler. It's known on the streets as "sheet rock", "Gypsum", "GB", "white wall candy". Well, I am off it and I am not looking back. Ok, now let's take a look back.

In the early days of BIM it was cool to see what you could model; the possibilities were endless: baseboards, casework section information, roof drains, 3/4" reveals. Then the realities of system performance and general practicality sunk in. To model the minute means to over-embellish the overall.

Gyp board seemed acceptable. I mean, what if I want to do a material takeoff of all the gyp board in my model? I had core boundaries if I wanted to dimension to the stud face, so that worked; everything was great. These were the salad days.

Unfortunately, these days were short lived as the workload that needed to be undertaken to realize this dream was, well… realized. It forced me to make many more wall types and take what should be a very simple and repetitive area of the floor plan to model and turn it into something far more complex. And for what?

What about when the gyp board stops above the ceiling but the wall structure keeps going? Are we really unlocking the gyp board layers and modifying them everywhere, not just in the documentation views? It's no wonder firms want to up their fees; it's no wonder there is significant concern over modeling scope and budgeting hours. And what about when the model changes? Too much work for too little benefit.
The data is what's important: type marks, partition type legends, fire ratings, detailed views. How could an estimator effectively use your model without gyp board? Easy: they probably weren't using it in the first place. Linear feet of wall and the areas and heights of rooms: these are the quantities chosen by the people who most directly benefit from your modeling workload.
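As a sketch of that point, totaling linear feet by wall type needs nothing but the lengths and type marks already in the model; no gyp board layers required. The schedule rows below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical rows from an exported wall schedule:
# (type mark, length in feet). Values are invented for illustration.
wall_schedule = [
    ("A1", 24.5), ("A1", 12.0), ("B2", 30.0),
    ("A1", 8.25), ("B2", 15.5),
]

def linear_feet_by_type(rows):
    """Total wall length per type mark -- the kind of quantity an
    estimator pulls from the model, with or without gyp board."""
    totals = defaultdict(float)
    for type_mark, length in rows:
        totals[type_mark] += length
    return dict(totals)

print(linear_feet_by_type(wall_schedule))
# {'A1': 44.75, 'B2': 45.5}
```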

I've been gyp-board-free for 3 years now and I can't tell you how good I feel. Join me.

Thursday, March 22, 2012

Owner Standards

One thing that I can say is lighting the fire of change across the industry is the owner BIM guideline. If you haven't been able to pore over one of these documents, I highly suggest you do, especially before bidding on one of these projects. It can drastically change your scope of work, and if you are not aware of that impact you run the risk of losing your shirt on these projects. In my opinion these guidelines have the potential to make AEC firms more profitable by increasing the workload/fees, and therefore the value, of a delivered model. Message: don't fear the guideline, saddle it and take it for a spin.
Here are a few examples of these owner guidelines for your reading pleasure. These are in no way meant to represent the full range of owner BIM Guidelines.

Monday, March 19, 2012

Why Vasari Makes Me Happy

With the new Vasari Technology Preview Version 2.5 available free from Autodesk Labs (link to Vasari 2.5 website), I wanted to take a post to point out some of the new improvements, as well as to share my general affection for this software.

I would like for Project Vasari to be taken seriously… eventually. I have done a lot with energy analysis and energy modeling during my time at IMAGINiT: Ecotect, Integrated Environmental Solutions <Virtual Environment>, and Green Building Studio, to name a few. The sad fact is that a gbXML file from a model built for documentation is not viable for most energy modeling and analysis. Don't get me wrong, I can make it work; but when I weigh the simplicity I need in an energy modeling application against the intricacies of a building model in Revit, the best of us balk. Vasari allows me to be generic and still tangibly communicate something powerful.

By Vasari I really mean Revit. The tool is easy to use for existing Revit users: it uses RVT files and has the same general interface and features, with some advancements over the Analysis tab that was included with the 2012 release of Revit. Vasari can be used to integrate early design modeling and leverageable analysis and data into a meaningful and cost-saving BIM process. Here are my thoughts on some key features.

Enabling an Energy Model - The 2.5 version of Vasari greatly improved the automatic zoning. The 2.1 release saw the Divide Perimeter Zones option abide by the ASHRAE 90.1 Appendix G thermal blocking requirement, and the 2.5 version has improved the Core Offset functionality to correctly zone courtyard spaces. Basically, a higher fidelity model that will therefore give you higher fidelity results. Or export the mass model's gbXML file and use it wherever you like. If this continues to improve, and maybe includes some per-zone modeling flexibility, it could really be a game changer in the building performance analysis software market.
Ecotect Wind Tunnel - This was released with 2.1 this past Fall, but it is too interesting not to mention. External CFD: airflow around buildings, through courtyards, very cool visuals. Unique and simple in its execution. The only question I am left with is "what could I use you for?".

A 2D slice is shown; it can also display 3D flow lines, and it runs as an animation.

Ecotect Solar Radiation Tool - This one has been around a while, but it's worth noting. If only I could use it on a building model and not just a mass model. Interesting information for designs taking advantage of sun and shade from the surrounding site, and for form-based self-shading.

Working in Perspective View - New in 2.5. It's kind of like Navisworks' orthographic and perspective 3D view options; it also reminds me of another free modeling application from another massive software company.
3D Modeling - I personally don't get too revved up about improvements to the massing tools; I guess my needs are simple. Massing in Revit always seems to turn into a practice in impractical architecture, but it sure is fun. The message should be: it doesn't matter where your design starts, you can quickly mass its shape and get some great data. Regardless, design options, schedules, and parameter-driven mass forms are all very cool and all absolutely possible.

Export to FBX - Mass, analyze, and render. FBX files allow you to take your mass form with materials into 3ds Max Design for photorealistic renderings. I think the conversation needs to be revisited: what are we showing our clients, and when are we showing it to them? This turns the early design deliverable on its ear: unique and meaningful design communication with less upfront workload.

Vasari has proven itself to be provocative, continually improving, and just plain cool. As modeling expertise expands in this industry, I hope to see more architects dabble in these sorts of massing forms and the data/analysis/visualization that is easily gleaned.