Monday 14 March 2011

2D23D

I first came across this post on Zach Kron's Buildz blog well over a year ago, and short of anything to write about this week I find myself returning to it. The post briefly demos some of the cool things that can be achieved by an API addition to Revit, mapping image data into 3D forms. It's nothing new; mesh modelling packages have been doing something similar for years, but I have been meaning to have a play with loading hand-drawn sketches into the plug-in and mapping them onto a basic grid.

Basically, I am interested to see whether I could generate a rough 3D model based purely on my doodles. Jennie wrote an article last week discussing how scanning your scribbles archives them, but it doesn't really wrap them into a coherent workflow. With my tendency to use felt pens at the programmatic stage, I would love to see if I could get even simple massing models to jump into life on screen. This seems very straightforward, and coupled with a graphics tablet I could see myself going through a lot of options very quickly, just doodling my thought process, in a way that is quite difficult to achieve in Revit normally by rearranging 3D components.
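To make the idea a little more concrete, here is a minimal sketch of how a scanned doodle might drive a massing grid, treating darker ink as taller form. It assumes Python with the Pillow imaging library and a hypothetical file called doodle.png; the grid size and height scale are arbitrary, and this is only an illustration of the image-to-grid mapping, not how the Revit plug-in actually works.

```python
from PIL import Image

def sketch_to_height_grid(path, grid_size=20, max_height=10.0):
    """Map a scanned sketch to a coarse grid of heights.

    Darker pen strokes read as taller masses; white paper stays at
    ground level. Illustrative only -- not the plug-in's method.
    """
    img = Image.open(path).convert("L")        # greyscale, 0-255
    img = img.resize((grid_size, grid_size))   # one pixel per grid cell

    heights = []
    for y in range(grid_size):
        row = []
        for x in range(grid_size):
            darkness = 255 - img.getpixel((x, y))      # 0 = paper, 255 = solid ink
            row.append(max_height * darkness / 255.0)  # scale into model units
        heights.append(row)
    return heights

# Each cell's value could then drive the Z offset of a point
# on a conceptual massing grid.
grid = sketch_to_height_grid("doodle.png")
```

Feeding each cell's height into a grid of reference points is roughly the kind of mapping the Buildz demo shows with photographs, just starting from felt pen instead.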

But could a little development work take the concept further? In the same way AutoCAD assigns different properties to different coloured lines, could a plug-in read different coloured pens differently? Could I even begin to mesh together sketches of plan and section into conceptual form?
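As a rough illustration of the colour idea, the snippet below labels each grid cell by its nearest reference pen colour. The colour-to-meaning table is entirely hypothetical, and a real plug-in would need something far more robust than nearest-colour matching, but it shows how a scanned sketch could carry more than just geometry.

```python
from PIL import Image

# Hypothetical mapping from felt-pen colour to a model property,
# in the spirit of AutoCAD's colour-coded line properties.
PEN_MEANINGS = {
    (0, 0, 255): "wall",       # blue pen  -> vertical element
    (255, 0, 0): "floor",      # red pen   -> horizontal plate
    (0, 128, 0): "landscape",  # green pen -> site / planting
}

def nearest_pen(rgb):
    """Return the meaning of the reference pen colour closest to rgb."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    colour = min(PEN_MEANINGS, key=lambda ref: dist(ref, rgb))
    return PEN_MEANINGS[colour]

def classify_strokes(path, grid_size=20, ink_threshold=200):
    """Label each grid cell by the pen colour drawn over it, or None for blank paper."""
    img = Image.open(path).convert("RGB").resize((grid_size, grid_size))
    labels = []
    for y in range(grid_size):
        row = []
        for x in range(grid_size):
            r, g, b = img.getpixel((x, y))
            if min(r, g, b) > ink_threshold:
                row.append(None)                 # mostly white paper: no stroke here
            else:
                row.append(nearest_pen((r, g, b)))
        labels.append(row)
    return labels
```

A blue stroke could then become a wall-like mass and a red one a floor plate, which is the sort of reading that might eventually let plan and section sketches be stitched into a single conceptual form.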
