Plantelligence and Ecomimesis

Animating Nature

Within a city like Baltimore, the landscape is generally treated as the ‘background’ for human activity: a largely undifferentiated expanse of green, with little thought given to the actual plants that fill the space. UMBC Professor of Visual Arts Lynn Cazabon moved those plants to the foreground, working with the IRC during a Summer Faculty Fellowship to explore the ways plants respond to the environment around them.

“Plantelligence emerged from my interest in how plants perceive and respond to events occurring in their surrounding environment, as a means to bring attention to how global warming impacts the way plants and in turn urban landscapes are currently evolving,” Cazabon explains. “Recent research in the field of plant neurobiology has led to inquiries into the evolutionary purposes of the many ways that plants sense their surroundings, including through analogues to sight, hearing, touch, taste and smell as well as through perception of electrical, magnetic, and chemical input.” But plants move and react at speeds below human perception, and this is where the IRC’s research in photogrammetry, 3D modeling, animation, and virtual reality came into play.

Fascinated by its adaptability to the stresses of living in human-created landscapes, Cazabon focused on one common species: Conyza canadensis, better known as horseweed. A plant most people consider a nuisance, horseweed is a native annual that thrives in urban areas.

Plantelligence first took shape in the photogrammetry rig, where the IRC scanned growing horseweed every 30 minutes for about two months, ultimately generating 8 terabytes of data. But the plan to create a 3D time-lapse film ran into a few snags. The first was that the plants did not grow as quickly as the team had hoped. The second was processing time: the computer needed 3–4 days to build a 3D model from a single scan, so processing every scan was out of the question.
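
The capture itself comes down to a timed loop. Below is a minimal sketch of such a scheduler, assuming a hypothetical trigger_scan() callable that fires the rig’s cameras and writes one scan’s images to disk; the IRC’s actual rig-control software is not documented here.

```python
# Minimal sketch of a timed photogrammetry capture loop.
# trigger_scan is a hypothetical callable standing in for the rig's
# real capture command; the actual control software is not shown here.
import time
from datetime import datetime
from pathlib import Path

SCAN_INTERVAL = 30 * 60          # seconds between scans (every 30 minutes)
OUTPUT_ROOT = Path("scans")      # one folder of raw images per scan

def run_capture_loop(trigger_scan):
    """Repeatedly trigger the rig and archive each scan's images."""
    while True:
        started = time.monotonic()
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        scan_dir = OUTPUT_ROOT / stamp
        scan_dir.mkdir(parents=True, exist_ok=True)
        trigger_scan(scan_dir)   # hypothetical: writes the rig's images into scan_dir
        # Sleep out the remainder of the 30-minute window.
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, SCAN_INTERVAL - elapsed))
```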

In addition, the models that were initially generated needed a lot of cleaning up by hand. IRC Technical Director Ryan Zuber calls these irregular models, riddled with holes and deformations, “crunchy,” and went to work smoothing them out. Instead of working with multiple scans, he cleaned up a single model plant, retopologizing its surface with quadrangular polygons so that the model could be textured and animated.
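
As a rough illustration of the automated passes that can precede hand retopology, the sketch below uses the open-source trimesh library to merge stray vertices, patch small holes, and smooth scan noise. The tooling and file names are assumptions, not the IRC’s actual pipeline, and the quad retopology itself was done by hand.

```python
# A minimal sketch of automated scan cleanup, assuming the trimesh library.
# Quad retopology is a manual step in a 3D package and is not shown here.
import trimesh

def rough_cleanup(in_path: str, out_path: str) -> None:
    mesh = trimesh.load(in_path, force="mesh")
    mesh.process(validate=True)          # merge vertices, drop degenerate faces
    mesh.fill_holes()                    # patch small gaps left by the scan
    trimesh.smoothing.filter_laplacian(mesh, lamb=0.5, iterations=10)  # soften "crunchy" noise
    mesh.export(out_path)

# Placeholder file names for illustration only.
rough_cleanup("horseweed_scan.obj", "horseweed_cleaned.obj")
```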

But, as Zuber and Cazabon realized, it is not easy to create and animate a realistic plant designed to be seen individually and up close. Horseweed has many leaves, tiny hairs, and variable textures, all of which need to move independently of one another and be seen at multiple stages of growth. Zuber treated each leaf as an individual ‘character’ and built a rig that could work for all of the leaves, regardless of their specific geometry. To animate the plant, he studied time-lapse films of growing plants to get a sense of the way leaves emerge and unfurl.
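
The sketch below illustrates the idea behind such a reusable rig: a chain of joints along a leaf’s midrib, all driven by a single ‘unfurl’ parameter so the same control works for any leaf geometry. The joint counts, angles, and easing curve are illustrative assumptions, not Zuber’s rig.

```python
# A minimal sketch of a reusable leaf rig: per-joint bend angles driven
# by one "unfurl" parameter. All numbers here are illustrative assumptions.
from dataclasses import dataclass
import math

@dataclass
class LeafRig:
    num_joints: int = 6          # joints placed along the midrib, base to tip
    curl_angle: float = 140.0    # total curl (degrees) when fully furled

    def joint_rotations(self, unfurl: float) -> list[float]:
        """Per-joint bend angles for an unfurl value in [0, 1]."""
        unfurl = min(max(unfurl, 0.0), 1.0)
        eased = 1.0 - math.cos(unfurl * math.pi / 2.0)   # ease-in so the tip opens last
        remaining_curl = (1.0 - eased) * self.curl_angle
        # Weight the remaining curl toward the tip of the leaf.
        weights = [(i + 1) / self.num_joints for i in range(self.num_joints)]
        total = sum(weights)
        return [remaining_curl * w / total for w in weights]

rig = LeafRig()
for t in (0.0, 0.5, 1.0):
    print(t, [round(a, 1) for a in rig.joint_rotations(t)])
```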

The next step involved placing the plant in a VR space where people could interact with it, which Cazabon envisioned as a generic, unadorned gallery space. The goal here was to bring the outside to the inside: to isolate the plant against a neutral space that is ideal for human perception. The final step was to bring the animated plant and gallery environment together with custom software that enabled a viewer to explore and interactively affect the plant’s growth.
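
One plausible way such software could let a viewer affect growth, sketched below purely as an assumption (the actual interaction logic is not documented here), is to scale each plant’s growth rate by the viewer’s proximity.

```python
# A hypothetical growth update: plants mature faster when the viewer lingers
# nearby. Rates, radii, and the 3x cap are illustrative assumptions.
def growth_step(stage: float, viewer_distance: float, dt: float,
                base_rate: float = 0.01, boost_radius: float = 1.5) -> float:
    """Advance a plant's growth stage (0 = seedling, 1 = mature) by dt seconds."""
    rate = base_rate
    if viewer_distance < boost_radius:
        # Closer viewers accelerate growth, up to 3x at zero distance.
        rate *= 1.0 + 2.0 * (1.0 - viewer_distance / boost_radius)
    return min(1.0, stage + rate * dt)
```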

In the summer of 2018, Cazabon displayed an updated version of this project, now called Ecomimesis, in the exhibition Hustle at Science Gallery Lab Detroit. An initiative of Michigan State University, Science Gallery Lab Detroit is part of an international network of galleries featuring projects that blend art, science, and technology, aimed at reaching young audiences. The animation was customized for the Hustle exhibition, with the virtual space designed to mirror the physical details of the gallery, so that viewers donning an Oculus headset see over a dozen plants emerging in a virtual, slightly idealized version of the space in which they are standing. The viewer’s body is intentionally not represented in the animation, resulting in an intimate encounter with the plants as the viewer floats around and merges with them. Ecomimesis was shown at two adjacent stations in the gallery, but each viewer’s experience is unique: the plants appear at randomly generated locations and at varying points in their growth cycles, and as one plant dies, another emerges.
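
The randomized lifecycle described above can be sketched as a simple spawn-and-replace loop; the names, counts, and timings below are illustrative assumptions rather than the exhibition’s actual code.

```python
# A minimal sketch of the randomized plant lifecycle: random positions,
# random starting growth stages, and respawn when a plant dies.
import random
from dataclasses import dataclass

@dataclass
class Plant:
    position: tuple[float, float]   # spot on the virtual gallery floor
    stage: float                    # 0 = seedling, 1 = end of life

def spawn_plant(room_size: float = 8.0) -> Plant:
    return Plant(
        position=(random.uniform(0, room_size), random.uniform(0, room_size)),
        stage=random.random(),      # start partway through the growth cycle
    )

def update_population(plants: list[Plant], growth_per_tick: float = 0.002) -> None:
    for i, plant in enumerate(plants):
        plant.stage += growth_per_tick
        if plant.stage >= 1.0:      # the plant dies; another emerges elsewhere
            plants[i] = spawn_plant()

population = [spawn_plant() for _ in range(14)]   # "over a dozen plants"
for _ in range(1000):
    update_population(population)
```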

Researchers and Creators

Project Director: Lynn Cazabon
IRC Technical Director – Modeling: Ryan Zuber
IRC Technical Director – Programming: Mark Jarzynski
Photogrammetry and Programming: Mark Murnane

Imaging Research Center, UMBC © 2024