The official title for SAA2007 Session 106 is Constructing Sustainability: Real-World Implementations of Preservation Standards for Born-Digital Design Documentation, but I think it might have been better served by including the word Architecture somewhere in its title. Sponsored by the Architectural Records Roundtable, this session considered issues related to preserving born-digital records of “the design community” – which here includes both architects and landscape designers.
Each panelist gave a five-minute brief on the way they are working toward preserving these design community records, and the rest of the session was opened up to Q&A. David Read, the session chair, mentioned that they used a wiki to collect questions and ideas for the session, gave an introduction to each of the panelists and helped guide the question-and-answer portion of the session.
Who was on the panel?
- David Read (Session Chair, Information Resources Manager, DiMella Shaffer )
- Phil Bernstein (Autodesk, Architect and Technologist)
- Carissa Kowalski Dougherty (Art Institute of Chicago, Department of Architecture and Design )
- Annemarie van Roessel (Columbia University, Avery Architectural and Fine Arts Library )
- Dennis Newman (General Manager at PFS Corporation, member of the ISO PDF standards working group)
What is being done?
Phil Bernstein kicked off the five-minute summaries with a quick history of design technology. He explained that a shift is currently in progress: hundreds of years of paper drawings were followed by ten to fifteen years of electronic drawings, and the latest development is Building Information Modeling (BIM). BIM relies on a database that generates ‘reports’ that are in fact ‘drawings’. These are sometimes referred to as Building Development Information Models. Digital printers can produce physical models directly from the stored BIM data, with no need to generate an actual drawing outside the computer.
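The core of the BIM idea – drawings as reports generated on demand from a database – can be illustrated with a toy sketch. To be clear, every field name and structure below is invented for illustration and does not reflect any real BIM schema:

```python
# Toy illustration: a "building model" is just structured data;
# a "drawing" is one of many possible reports generated from it.
# All names and fields here are invented for illustration only.

building = [
    {"type": "wall", "id": "W1", "length_ft": 20, "height_ft": 10},
    {"type": "door", "id": "D1", "in_wall": "W1", "width_ft": 3},
]

def schedule_report(model, element_type):
    """Generate a simple 'schedule' (one kind of drawing/report)
    by querying the model database for elements of one type."""
    rows = [e for e in model if e["type"] == element_type]
    return [f"{e['id']}: {e.get('width_ft', e.get('length_ft'))} ft" for e in rows]

# The same underlying data can feed a door schedule, a wall schedule,
# a rendering, or a digitally printed physical model.
print(schedule_report(building, "door"))
```

The point of the toy is that nothing in the database *is* a drawing; each drawing is a disposable view, which is exactly what makes deciding what to preserve so hard.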
Phil showed Yale School of Architecture design examples from the BIM world. These were fantastical organically shaped creations that looked more like strange undiscovered plants from under the sea than traditional buildings!
The good news is that the data in the BIM databases is all just text. The bad news is that the ‘design artifacts’ generated from that text data take many forms, up to and including digitally printed physical models – there has been an explosion in the various means of representation. The architecture world is catching up to other industries (such as the auto industry) that have been doing this for 25+ years.
Current architects are application agnostic – they don’t care what they use to create their outputs. All the paths and platforms will only grow, and what drives the design process will only increase in complexity. The building industry is making a fundamental shift from electronic drawing to the Building Information Modeling approach – but there is an unlimited environment for representation. He hoped to discuss the intersection between archival/record-keeping issues and the problems facing the architecture world.
Carissa Kowalski Dougherty’s overview covered the Digital Archive for Architecture (DAArch) project out of the Art Institute of Chicago. The project was based on the 2004 study Collecting, Archiving, and Exhibiting Digital Design Data. They considered how architecture and design firms are using software tools to design and produce – but examined these questions from a museum and curatorial perspective.
The recommendation is a two-tiered collection approach.
- First tier: native files – such as AutoCAD files – preserved at the bit level, but with no commitment to ensuring access to these files
- Second tier: output formats – PDF and TIF files only
PDF: line drawings, vector-based graphic files, text documents, PowerPoint presentations
TIF: renderings, digital photographs
The second-tier outputs are what they are committing to “functionally preserve”.
Carissa presented an example of what they accessioned from Garofalo Architects‘ Manilow Residence (2001-2003) project. Many of the files were ones that no one (including the small architectural firm itself) could still open – the software is gone. Another major challenge was poor naming conventions for the files themselves. The final project archive included over 200 native vector 2D files (.dxf, .dgn, .dwg), 145 PDFs, and more.
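A first-pass triage for an accession like this – hundreds of opaquely named files in unknown formats – is often a simple inventory of extensions, which at least reveals which formats dominate and which may be unopenable. A minimal sketch (the directory path in the usage comment is hypothetical):

```python
from collections import Counter
from pathlib import Path

def extension_inventory(root):
    """Count files by extension under a directory tree - a first-pass
    triage for an accession full of unidentified design files."""
    return Counter(
        p.suffix.lower() or "(none)"          # normalize .DWG/.dwg; flag extensionless files
        for p in Path(root).rglob("*")
        if p.is_file()
    )

# Usage against a hypothetical accession directory:
# for ext, count in extension_inventory("accessions/manilow").most_common():
#     print(ext, count)
```

Extension counts are only a hint – real format identification tools go further by inspecting file signatures – but even this crude view makes the scale of the native-file problem visible.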
From UrbanLab they sought to preserve the Visitor Information Center competition entry from 2001. This was a project that was never built and therefore has little physical output. The firm mostly used AutoCAD (2D), Maya (3D), FormZ (3D) and Adobe Illustrator (layout).
The DAArch Software highlights:
- browser based
- DSpace as back end
- Dublin Core augmented with CDWA and custom metadata to support architecture data and digital materials
- authority records
- group and item level cataloging
- will be available open source under a BSD license via SourceForge (open-source release was a requirement of the funder)
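Dublin Core records of the kind the software list above describes are typically serialized as simple element/value XML. A hedged sketch of what a record for one accessioned file might look like – the element values and the `dc_record` helper are invented for illustration, not taken from the actual DAArch software:

```python
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"  # unqualified Dublin Core element set

def dc_record(fields):
    """Build a minimal Dublin Core XML record from a dict of
    element name -> value (unqualified DC elements only)."""
    ET.register_namespace("dc", DC_NS)
    root = ET.Element("record")
    for name, value in fields.items():
        el = ET.SubElement(root, f"{{{DC_NS}}}{name}")
        el.text = value
    return ET.tostring(root, encoding="unicode")

xml = dc_record({
    "title": "Manilow Residence - plan drawing",  # illustrative values only
    "creator": "Garofalo Architects",
    "format": "image/vnd.dwg",
    "date": "2001",
})
```

The CDWA and custom architecture fields mentioned above would extend a base record like this; plain Dublin Core alone clearly cannot describe 3D models or BIM data, which is exactly the gap the project identified.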
Final lessons and challenges from the DAArch project:
- file naming and organization – the biggest challenges at the smaller firms – need outreach to these firms
- metadata for digital objects – there is not a lot out there for 3D digital images
- software and migration tools – can we/should we preserve the software dependent first tier files? or just the PDF/TIF outputs?
- three-dimensional objects, BIM, animations, etc
Annemarie van Roessel discussed Columbia’s major Manhattanville project. Their goal is to make digital records last as long as steel and glass. The Avery Architectural and Fine Arts Library is feeling the pressure to be a leader, so how does Avery document this project? Manhattanville is a 30 year planning, design and build project targeted to be completed in 2030. It will cover 17 acres northwest of the main Columbia campus.
There are many building blocks to the digital design archives: AutoCAD files, project management records, collaborative environments (Microsoft SharePoint), images, presentations, websites and movies (i.e., more than just “scary CAD drawings”). They are planning staged preservation points. The Avery is committed to developing capacity for digital archiving by 2009. For their metadata they use, at minimum, the mandatory DACS elements mapped to Dublin Core elements.
Dennis Newman was the final panelist. He has clients who need to preserve/archive finished drawings – such as the documents sent along to regulatory agencies for final approval. PDF/A-1 was based on ‘electronic paper’ – you lose lots of data when you ‘cut back’ to PDF/A. PDF/E is in its first draft/generation, being submitted for version 1. PDF/A didn’t address 3D, complex metadata or moving images. PDF/E is based on Acrobat version 7. Adobe has handed PDF over to the ISO community. Dennis believes that the final ‘as-built’ drawing is what should be the archived version.
He pointed out that first responders need more information than the regulatory commissions need. Since 9/11 the state requirements about what needs to be in the ‘record’ have changed.
As an IT professional he is asked “what can we do?” and his answer is “how much do you want to spend?” IT can do anything – but it takes time and money.
Questions and Answers
Keep in mind throughout this section that I was summarizing the questions and their answers as best I could. Please do not take any statements attributed to the session speakers as full and complete quotes. In cases where I missed too much of the question or answer I generally skipped including it in the list below. If you are anxious to know exactly what someone said, you would need to buy and listen to the conference recordings for this session.
QUESTION: Could a neutral exchange format such as the International Alliance for Interoperability‘s (IAI) Industry Foundation Classes (IFC) be the foundation for, or a piece of, the next step in preservation of born-digital design documentation? IFC is text plus a data model that can be read by different software (import/export of data). You can do this now with AutoCAD – you can dump into IFC.
Phil: Is a neutral exchange format the answer to the archiving problems? Software is changing so fast that there is no way a standard could keep up with it. Also – even if all the data in the world could be put in XML – you still need something to ‘read and do something’ with the data. He put the business-process diagram from his talk back on screen and pointed out that all the different tools and their outputs exist within the CONTEXT of the business process itself.
Carissa (?): IFC is a recommendation of the Art Institute of Chicago
QUESTION: William Reilly from the FACADE project started to ask about the challenges inherent in the fact that the IFC standard only gives you the geometry. There was some back and forth about this idea, with voices noting that IFC can capture more than that – but not everything.
Kristine Fallon: The idea of doing a neutral format for complex information is a complicated thing. Going back probably 20 years, the people working on data exchange standards for engineering … the different software won’t perfectly talk to each other – but what they can do is exchange ‘model views’. The IFC data model is capable of a fairly comprehensive set of model views.
QUESTION : Who is going to keep it up in 20 years? Are the software producers going to keep it up?
Phil: Autodesk spent 5 million dollars in building the IFCs. If the archivists align their needs with the business needs then the business will pay for it and the archivists will get what they need.
Annemarie: The archivists don’t have the money and resources – even at Columbia they don’t have the money to buy generation after generation of software to read all the different file formats. Maybe the MIT approach of emulation is a better approach.
David: Will there ever be a day when I have an emulator on my desktop? That makes me more curious about exporting pure text – I can get my head around preservation of that.
Annemarie: The Manhattanville project is the first step for Columbia in collecting digital data. Archivists need to reach out to organizations now to explain that they want to preserve what those organizations are creating. I am being honest about the chaos coming down the track when we start getting the data from the 90s.
QUESTION (from the audience): The function of IFC is not archiving – it is to let different software products communicate with one another. How do you figure out which artifacts of the design process to keep? How do you extract the ‘important’ parts from what is ‘less’ important?
Phil: What about when there are physical digital models, analytical models and more – how do you understand all of it?
Carissa: The architectural firms need to be able to get to all of this too. It isn’t just archivists who should care about access to all these models. There are legal ramifications and the possibility of renovations later… this needs to come out of the architecture profession.
QUESTION (I asked the following question): Given how challenging it is to preserve the final products, has there been any thought of trying to preserve the process instead? With paper it is easier to preserve the evolution of a design.
Annemarie: In the Manhattanville project one of the big challenges is the architect who does lots of self-editing. In many cases they don’t want the world to see their interim choices during the design process.
Phil: Digital tools can encourage you to explore useless ideas. Keep in mind that the journal file for the Building Information Model keeps track of every change. It will tell you that on Tuesday at 4:10 pm someone moved this door 5 inches to the left.
Carissa: At the Art Institute, architects and archivists need to work together to figure out what is worth capturing.
David: There are two different schools of thought – archiving the final product or archiving the process. File formats preserve the final product.
QUESTION : There is danger in keeping everything – the goal of archiving is to keep the best final version. The big hulking databases of the world open the door to keeping an overwhelming set of unimportant data.
Annemarie: The needs of all their different consumers are so broad. Perhaps snapshots should be taken more often – thinner slices.
Carissa: A 2D snapshot is not going to capture the fullness of a 3D object – it captures something, but not as much as it might.
Phil: There could be an interactive digital simulation that generates 3D models – there might be no ‘final’ product at all. Can we have an impact on how information is kept 4, 10, 30 years from now? In a world where you can borrow (or pay for) processing time, someone will keep all the versions of AutoCAD – you will pay a third party for the 15 seconds of rendering time in AutoCAD 14.
Kristine Fallon: There is a real business purpose to sorting this out… the IAI work is very real world.. defining model views can help support business.. but they can also support the goals of archivists.
Kristine Fallon‘s question: Was PDF/E designed to be an archival format?
Dennis: No – it was designed to be a data interchange format. People who don’t want to hand lots of proprietary data to another vendor still need to give that vendor enough data to work with – that is where PDF/E came from.
As seems to be the case with all born-digital records, there are no easy answers. Events like 9/11 have changed the types of final products that regulatory agencies and first responders need to evaluate and access easily, and the speed of innovation and evolution in building design is stunning. It should come as no surprise that architects are more concerned with finding the best tools for their trade than with how to preserve the artifacts of their ultimate creations. They will change the tools they use whenever they find a better tool to manifest their vision.
The most promising option seems to be having archivists get involved in discussions with the software developers, the architects, the builders and government early in the design process. The traditional model of archivists receiving the final products of business processes years after they were completed does not appear to be an answer we can depend on. I suspect that proactive efforts to plan for preservation from the start will pay off – both for those trying to use the records 10 years from now and for those who want to preserve some subset of the records of the design community for future generations.
As is the case with all my session summaries from SAA2007, please accept my apologies in advance for any cases in which I misquote, overly simplify or miss points altogether in the post above. These sessions move fast and my main goal is to capture the core of the ideas presented and exchanged. Feel free to contact me about corrections to my summary either via comments on this post or via my contact form.
Thanks for pointing this out to me in my blog. I’ve printed it and will read it on the train tomorrow.
Very nice and thorough session summary. I’m surprised that from this account no one’s mentioned the recently completed InterPARES 2 Project (http://www.interpares.org) that looked at very similar kinds of electronic records.
Thanks! And yes- there was no mention of InterPARES 2 during the session. I took a quick look over on the InterPARES 2 site and spotted both the Preservation and Authentication of Electronic Engineering and Manufacturing Records and Selecting Digital File Formats for Long-Term Preservation case studies as possibly having good related materials for this discussion.