Scientists Translated A Spider’s Web Into Music, And It’s Quite Captivating


Spiders rely heavily on touch to sense the world around them. Their bodies and legs are covered in tiny hairs and slits that help distinguish different types of vibrations.

Prey struggling in a web produces a vibrational clamor very different from another spider arriving to court, or the stirring of a breeze, for example. Each strand of a web produces a different tone.

A few years ago, scientists translated the three-dimensional structure of a spider's web into music, working with artist Tomás Saraceno to create an interactive musical instrument, titled Spider's Canvas.

The team then refined and expanded on this previous work, adding an interactive virtual reality component to allow people to enter and interact with the web.

This research, the team says, will not only help them better understand the three-dimensional architecture of a spider’s web, but may even help us learn the vibrational language of spiders.

“The spider lives in an environment of vibrating strings,” MIT engineer Markus Buehler explained in 2021. “They can’t see very well, so they feel their world through vibrations, which have different frequencies.”

When we think of a spider's web, we most likely picture the web of an orb-weaver: flat and round, with radial spokes around which the spider builds a spiral net. Most spider webs, however, are not of this type but are constructed in three dimensions – sheet webs, tangled webs, and funnel webs, for example.

To explore the structure of these types of webs, the team housed a tropical tent-web spider (Cyrtophora citricola) in a rectangular enclosure and waited for it to fill the space with a three-dimensional web. Next, they used a sheet laser to illuminate the web and capture high-definition images of 2D cross-sections of it.

A specially developed algorithm then reconstructed the 3D architecture of the web from these 2D cross-sections. To turn this into music, different sound frequencies were assigned to different strands. The resulting notes were played in patterns based on the structure of the web.
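The article doesn't detail the researchers' actual strand-to-note mapping, but the idea of sonification can be sketched in a few lines. The sketch below is purely illustrative: the function name, the length-based mapping, and the strand lengths are all assumptions, chosen so that shorter strands (like shorter strings on an instrument) get higher pitches.

```python
def strand_frequency(length, min_len, max_len, f_low=110.0, f_high=880.0):
    """Map a strand's length onto an audio frequency (Hz).

    Shorter strands get higher pitches, loosely mimicking how shorter
    strings vibrate faster. This mapping is an illustrative guess,
    not the researchers' published algorithm.
    """
    # Normalize length to [0, 1], where 0 is the shortest strand
    t = (length - min_len) / (max_len - min_len)
    # Interpolate logarithmically so equal steps sound musically even
    return f_high * (f_low / f_high) ** t

strands = [2.0, 5.5, 9.0, 12.5]  # strand lengths in mm (made-up data)
freqs = [strand_frequency(s, min(strands), max(strands)) for s in strands]
# Shortest strand -> 880 Hz, longest -> 110 Hz, spaced geometrically
```

Playing such notes in the order the strands appear in the 3D reconstruction would yield a melody whose contour follows the web's geometry.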

They also scanned a web as it was being spun, translating each step of the process into music. This means the notes change as the structure of the web changes, and the listener can hear the web being built.

Having a step-by-step recording of the process also means we can better understand how spiders build a 3D web without supporting structures – a skill that could be used for 3D printing, for example.

Spider's Canvas allowed audiences to hear the spider's music, but the virtual reality component, which lets users step into the web and play its strands themselves, adds a whole new layer of experience, the researchers said.

“The virtual reality environment is really intriguing because your ears will pick up structural features that you might see but not immediately recognize,” Buehler explained.

“By hearing it and seeing it at the same time, you can really begin to understand the environment the spider lives in.”

This VR environment, with realistic web physics, also allows researchers to understand what happens when they disrupt certain parts of the web. Stretch a strand and its tone changes. Break one and hear how it affects the other strands around it.
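Why does stretching a strand change its tone? For an ideal taut string, the fundamental frequency is f = (1/2L)·√(T/μ), where L is length, T is tension, and μ is mass per unit length. Stretching increases tension faster than length, so the pitch rises. The numbers below are invented for illustration, not measured silk properties:

```python
import math

def string_pitch(length_m, tension_n, mass_per_m):
    """Fundamental frequency of an ideal taut string: f = (1/2L) * sqrt(T/mu)."""
    return math.sqrt(tension_n / mass_per_m) / (2 * length_m)

# Hypothetical silk strand: 5 cm long, 1 mN tension, 0.1 mg per meter
L, T, mu = 0.05, 0.001, 1e-7

f0 = string_pitch(L, T, mu)                        # resting pitch: 1000 Hz
f_stretched = string_pitch(L * 1.02, T * 1.5, mu)  # tension grows faster than length
# f_stretched > f0: the stretched strand sounds higher
```

A VR physics engine applying this relationship per strand would reproduce the effect described above: pull on a strand and its note bends upward.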

This too can help us understand the architecture of a spider’s web and why they are built the way they are.

Perhaps most fascinatingly, the work allowed the team to develop an algorithm to identify the types of vibrations in a spider's web, translating them into "prey trapped", "web under construction", or "another spider has arrived with amorous intent".
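The team's actual classifier isn't described here, but the general shape of such a system can be sketched: estimate the dominant frequency of a vibration signal and assign it to a labeled band. Everything below is assumed for illustration – the frequency bands, the zero-crossing estimator, and the labels are stand-ins, not the researchers' method:

```python
import math

def dominant_frequency(signal, sample_rate):
    """Estimate a signal's frequency by counting zero crossings (crude but dependency-free)."""
    crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if a < 0 <= b or b < 0 <= a
    )
    return crossings * sample_rate / (2 * len(signal))

def classify_vibration(signal, sample_rate=1000):
    """Toy classifier: label a web vibration by its dominant frequency.

    The bands below are invented for illustration only.
    """
    f = dominant_frequency(signal, sample_rate)
    if f < 20:
        return "web under construction"
    elif f < 100:
        return "prey trapped"
    return "another spider has arrived"

# One second of a 50 Hz test tone, sampled at 1 kHz
samples = [math.sin(2 * math.pi * 50 * n / 1000) for n in range(1000)]
label = classify_vibration(samples)  # falls in the 20-100 Hz band
```

A real system would likely use spectral features and a trained model rather than fixed thresholds, but the pipeline – sense vibrations, extract frequency content, map to behavior – is the same.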

This, the team said, lays the groundwork for learning the language of spiders – at least, that of the tropical tent-web spider.

“Now we’re trying to generate synthetic signals to basically speak the language of the spider,” Buehler said.

“If we expose them to certain patterns of rhythms or vibrations, can we affect what they do and can we start communicating with them? Those are really exciting ideas.”

The team’s previous research was published in 2018 in the Journal of the Royal Society Interface.

An earlier version of this article was published in April 2021.
