RTI Flash Assembly


Week 2

Week 2 Overview (July 20th, 2012):

Our second week of the project ends today. On Monday the project team traveled back to O’Keeffe’s home in Abiquiu for photogrammetry capture. Dale wanted to capture a section of the exterior adobe wall. It was mid-day, and the wall he originally wanted to capture faced a garden with large bushes and a tree. The tree was casting too many shadows on the wall and there was a slight breeze. The team did not know how this would affect image processing within PhotoScan, so we moved to an alternative wall without any direct shadows falling on it (besides some plants and a ladder).

There was also a large, recently watered garden between the camera and the wall, so a 100mm lens was needed. The team positioned the camera 42 ft away from the wall so that the wall would be fully within the frame. Ideally the camera would have been closer to the wall, but obstacles such as the moist soil forced us farther back than we would have liked.
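As a quick sanity check on that framing, the wall coverage at 42 ft can be estimated from the lens focal length and the sensor size. The sketch below assumes a full-frame 36 x 24 mm sensor; the actual camera body isn’t stated above, so treat the numbers as illustrative only.

```python
# Rough framing check for a 100 mm lens at 42 ft (about 12.8 m).
# Assumes a full-frame 36 x 24 mm sensor -- an assumption, since the
# camera body is not specified above.
FOCAL_LENGTH_MM = 100.0
SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0
DISTANCE_M = 42 * 0.3048  # 42 ft in meters

def coverage_m(sensor_mm, focal_mm, distance_m):
    """Approximate width/height of wall captured in one frame (pinhole model)."""
    return sensor_mm / focal_mm * distance_m

width = coverage_m(SENSOR_W_MM, FOCAL_LENGTH_MM, DISTANCE_M)
height = coverage_m(SENSOR_H_MM, FOCAL_LENGTH_MM, DISTANCE_M)
print(f"Coverage at 42 ft: {width:.1f} m x {height:.1f} m")
# -> roughly 4.6 m x 3.1 m of wall per frame
```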

To keep the front two legs of the tripod at a constant 42 ft from the wall, a chalk line was used as a guide. This enabled the team to maneuver the camera horizontally while keeping an accurate distance from the wall.

Along our horizontal line, a large bush sat within our line of sight. To work around it, we shot at a 15-degree angle from outside the bush to capture the section of wall it was blocking; PhotoScan had no problem putting the 3D image together.
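For anyone repeating the maneuver, the geometry is simple right-triangle math: viewing the same patch of wall from 15 degrees off-normal while staying 42 ft out on the chalk line means sliding the tripod roughly 11 ft sideways. A minimal sketch with illustrative numbers, not our measured offsets:

```python
import math

DISTANCE_FT = 42.0   # tripod-to-wall distance along the chalk line
ANGLE_DEG = 15.0     # off-normal shooting angle used to see past the bush

# Lateral shift along the chalk line needed to view the same wall patch
# from 15 degrees off-normal (simple right-triangle geometry).
lateral_shift_ft = DISTANCE_FT * math.tan(math.radians(ANGLE_DEG))
print(f"Lateral shift: {lateral_shift_ft:.1f} ft")  # ~11.3 ft
```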

Maneuvering the shot around the bush.

After all the images were captured (horizontally, then at 90 and 270 degrees), they were ready to be processed in PhotoScan.
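Our own PhotoScan processing was run through the program’s menus, but the same align, dense-cloud, and mesh sequence can also be scripted in the Pro edition. The sketch below is written against the later PhotoScan Pro 1.x Python API, so the exact call names and arguments are an assumption and may differ from the version we used; the image list is a placeholder.

```python
# Sketch of the align -> dense cloud -> mesh sequence using the Agisoft
# PhotoScan Pro Python API (1.x era). We processed through the GUI; call
# names and arguments vary between PhotoScan versions, so this is an
# assumed illustration, not our exact workflow.
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.addChunk()

# Placeholder file names -- substitute the actual capture folder.
chunk.addPhotos(["wall_001.tif", "wall_002.tif", "wall_003.tif"])

chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy)      # find tie points
chunk.alignCameras()                                    # camera poses + sparse cloud
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)  # dense point cloud
chunk.buildModel(surface=PhotoScan.Arbitrary)           # triangulated mesh

doc.save("adobe_wall.psz")
```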

Much of the detail was lost, probably because we were too far from the wall, but the mesh was mostly clear. Some of the areas where plants stood against the wall are distorted.

Point Cloud View:

Results in the point cloud view mode.

Solid View:

Results in the solid view mode.

Throughout the week we completed additional RTIs of three paintings: “Easter Sunrise,” “Pedernal,” and “Mesa and Road East.” The continuous practice of RTI makes the process faster and smoother.

Project Notes

July 12, 2012

Dale here:

Two characteristics of our project are especially important. First, it is important to us that the entire process is scientific, i.e. transparent, documented, replicable and aligned with ISO, AIC and ICOM photographic documentation standards. To that end, the entire sequence, from capture to computational image, is open-format, non-proprietary and open source. The metadata and computational pathway of each pixel is readable, documented and archival, and can be assembled into a 3D computational image at any time in the future using whatever developments in normal-reflection or photogrammetry software and camera hardware occur. To fulfill the scientific requirements of photo documentation in conservation and preservation, 3D or otherwise, there can be NO proprietary files or software conversions, and the owner of the initial capture needs to own and archive all workflows using a metadata “lab notebook”. No steps can be hidden.
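To make the “lab notebook” idea concrete, here is a minimal sketch of what one capture record could look like when kept as plain, open-format JSON. The field names and values are hypothetical illustrations, not our actual schema.

```python
# Hypothetical "lab notebook" entry for one capture session, written as
# plain JSON so the record stays open-format and readable far into the
# future. Field names and values are illustrative, not the museum's schema.
import json

entry = {
    "object": "Exterior adobe wall, O'Keeffe home, Abiquiu",
    "date": "2012-07-16",
    "technique": "stereo photogrammetry",
    "camera": {"lens_mm": 100, "capture_format": "raw", "mode": "manual"},
    "geometry": {"camera_to_wall_ft": 42, "angles_deg": [0, 90, 270]},
    "processing": [
        {"step": "raw conversion", "software": "converter name and version recorded"},
        {"step": "align + dense cloud + mesh", "software": "PhotoScan, version recorded"},
    ],
    "operator": "project team",
}

with open("capture_2012-07-16.json", "w") as fh:
    json.dump(entry, fh, indent=2)
```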

Second, we are principally aiming to develop reliable protocols for 3D, photogrammetrically accurate images using CONSUMER GRADE DIGITAL CAMERAS, rather than expensive and complex laser-scanning technologies. If reliable workflows can be defined, ANYONE with a digital camera that has a manual mode, raw capture capability and a flash, plus a laptop computer, may be able to gather RGB color and 3-D data and assemble 3D digital surrogates that can be used to monitor very slight condition changes over a very long period of time.

On the LinkedIn group Heritage Conservation/Historic Preservation of the Built Environment Network, Graham U’ren, a trustee at Built Environment Forum Scotland, posted a comment noting the visually exciting and richly detailed Scottish Ten project, an “ambitious five year project using cutting edge technology to create exceptionally accurate digital models of Scotland’s five UNESCO designated World Heritage Sites”. The Scottish Ten project is partnering with CyArk, a 501(c)(3) nonprofit that uses its proprietary software to build and store 3D image files from laser-scanning point clouds. They use highly precise, high-speed terrestrial laser-scanning systems, some capable of sub-millimetre data capture, and aerial optical remote-sensing technology. Visit the project website at http://www.scottishten.org/

About

Monday, June 25, 2012 begins a new, 8-week O’Keeffe Museum Conservation Program summer preservation project. This year, thanks to the generous financial support of the National Center for Preservation Technology and Training (http://ncptt.nps.gov/), the Stockman Family Foundation and the New Mexico AmeriCorps Cultural Technology community service program, we will be testing three exciting new technologies in “computational imaging”.

In conservation and preservation, computational imaging uses the power of today’s laptop microprocessors and digital cameras to create accurate, archival, 3-D images and documentation for artistic and cultural objects. From the microscopic texture of a paint brush-stroke to the undulating pitch and volume of a historic adobe wall, computational imaging can document and help monitor the changes in three-dimensional shape, size and deterioration of the historic properties and objects under the care of the Georgia O’Keeffe Museum. Since we preserve, document and monitor everything from historic landscapes and historic structures to O’Keeffe’s paintings, pastels and drawings to the tiny snail and scallop shells she collected and imaged in her art, automating and increasing the accuracy of our documentation and measurements can help us get better work done, more systematically, and in less time.

New Mexico AmeriCorps Cultural Technology (NM-ACT) interns Joey Montoya of Espanola and Greg Williamson of Santa Fe will join Head of Conservation Dale Kronkright and Assistant Registrar Darrah Wills in testing three imaging processes: Reflectance Transformation Imaging, Stereo Photogrammetry and Structured Light Imaging. Each uses different algorithms to construct detailed 3-D images and data from sets of ordinary 2-D digital photographs.
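To give a flavor of the math these algorithms build on, here is a simplified per-pixel example of how surface shape can be recovered from ordinary photographs lit from known directions (classic Lambertian photometric stereo, one ingredient closely related to the surface normals RTI processing extracts; real RTI and photogrammetry pipelines are far more involved). The light directions and intensities below are made-up values.

```python
# Simplified per-pixel normal estimation from photographs lit from known
# directions (Lambertian photometric stereo). Shown only to illustrate how
# ordinary 2-D photos carry recoverable 3-D surface data; values are made up.
import numpy as np

# Light directions (unit vectors) for four hypothetical flash positions.
L = np.array([
    [ 0.5,  0.0, 0.866],
    [-0.5,  0.0, 0.866],
    [ 0.0,  0.5, 0.866],
    [ 0.0, -0.5, 0.866],
])

# Observed intensities of one pixel under each light (hypothetical values).
I = np.array([0.71, 0.55, 0.68, 0.58])

# Solve L @ g = I in the least-squares sense; g = albedo * surface normal.
g, *_ = np.linalg.lstsq(L, I, rcond=None)
albedo = np.linalg.norm(g)
normal = g / albedo
print("albedo:", round(float(albedo), 3), "normal:", np.round(normal, 3))
```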

To make the documentation useful from a scientific perspective, the source photographic conditions, resolution and algorithmic processing pathways must be carefully documented in a digital “lab notebook”. Further, to facilitate accurate computer comparison of the 3-D features of a painting, door, window or landscape over time – to determine whether features are subtly changing in ways that indicate underlying deterioration or damage – the algorithms must be able to recognize and compare L*a*b* feature data in photographs taken years or decades later with different cameras, different lighting conditions and slightly different orientations. These conditions are pretty demanding!
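As one concrete example of that kind of comparison, the standard way to quantify the difference between two L*a*b* measurements is a color-difference metric such as CIE76 delta-E. A minimal sketch with illustrative values, not museum data:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical readings of the same paint passage, years apart.
capture_2012 = (52.3, 18.1, 24.7)
capture_2022 = (51.8, 18.9, 23.9)
print(f"Delta E*ab: {delta_e_76(capture_2012, capture_2022):.2f}")  # ~1.24
```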