Purdue University: 150 years of giant leaps; 1869-2019

Exhibits

Artificial Intelligence Generated Art

Bloemenveiling is a distributed app to auction artificial-intelligence-generated tulips. Echoing the auctions that sprang up in taverns throughout 17th-century Holland at the height of tulip mania, the piece interrogates the way technology drives human desire and economic dynamics by creating artificial scarcity. Short moving-image pieces of tulips created by generative adversarial networks are sold at auction using smart contracts on the Ethereum network. Each time a tulip is sold, thousands of computers around the world work to verify the transaction, checking one another's work. Even if the website for the auction shuts down, a permanent record will still exist of who owns which tulip. A network of bots participates in the auction alongside humans, driving demand in much the same way as the automated trading algorithms that drive modern financial markets. After a week, the tulip is blighted, and the moving-image piece can no longer be viewed by its owner. In this way, the decay and impermanence of the natural world are reintroduced into the digital world. While the artificial intelligence that created the moving-image pieces has the potential to generate infinite flowers, the enormous distributed network behind Ethereum is used, at great environmental cost, to introduce scarcity to an otherwise limitless resource.
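
For readers curious about the mechanics, the buy-then-blight behavior described above can be sketched off-chain. The following Python is a minimal illustration only, not the artist's Ethereum smart contract; the class and method names and the one-week constant are assumptions that mirror the description.

```python
import time

BLIGHT_SECONDS = 7 * 24 * 60 * 60  # the tulip decays one week after sale

class TulipAuction:
    """Off-chain mock of the auction-and-decay logic described above."""

    def __init__(self, tulip_id):
        self.tulip_id = tulip_id
        self.highest_bid = 0.0       # in ether
        self.highest_bidder = None
        self.sold_at = None

    def bid(self, bidder, amount):
        # Bots and humans call the same entry point; only a higher bid is accepted.
        if self.sold_at is not None:
            raise RuntimeError("auction already settled")
        if amount <= self.highest_bid:
            return False
        self.highest_bid, self.highest_bidder = amount, bidder
        return True

    def settle(self):
        # On-chain, thousands of nodes would verify this state change;
        # here we simply record the winner and the time of sale.
        self.sold_at = time.time()
        return self.highest_bidder

    def is_blighted(self, now=None):
        # After a week the owner can no longer view the moving image.
        now = time.time() if now is None else now
        return self.sold_at is not None and now - self.sold_at > BLIGHT_SECONDS
```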

This work will be displayed by Anna Ridler, Artist and Researcher, London.

ArtWaves

ArtWaves is an experimental visual feedback system that examines the relationship between brain signals, computing, design, visualization, and visual expression. The work translates a participant's brain waves and the topographic levels of activity across the cerebral cortex into visual forms presented on a video wall. Colors and shapes are influenced by the levels of different brain-wave activities. Brain signals are sampled in real time using a non-invasive electroencephalogram (EEG) headset. Signals from the EEG sensors are transmitted via Bluetooth and Wi-Fi to a visual processing system that separates them into bands representing the alpha, beta, gamma, delta, and theta waves. The strengths of these signals are further processed to create the artistic display.
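
As a rough sketch of the kind of processing described above (not the exhibit's actual software), the following Python separates an EEG trace into the five named bands using conventional band edges and maps their relative power to a color; the sampling rate, band limits, and color mapping are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256  # assumed EEG sampling rate in Hz

# Conventional frequency ranges for the five bands named above (Hz).
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta":  (13, 30),
    "gamma": (30, 45),
}

def band_powers(eeg, fs=FS):
    """Band-pass filter each named band and return its mean power."""
    powers = {}
    for name, (lo, hi) in BANDS.items():
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, eeg)
        powers[name] = float(np.mean(filtered ** 2))
    return powers

def to_color(powers):
    """Map relative band power to an RGB triple (0-255)."""
    total = sum(powers.values()) or 1.0
    r = int(255 * powers["beta"] / total)    # arousal -> red
    g = int(255 * powers["alpha"] / total)   # relaxation -> green
    b = int(255 * powers["theta"] / total)   # drowsiness -> blue
    return (r, g, b)
```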

As the intersection of art and technology continues to evolve, the use of computer algorithms coupled with brain signals will greatly impact the process of viewing, experiencing, and making art in the 21st century. In ArtWaves, the object of contemplation, the visual form, is an active extension of the user's levels of brain activity. This work changes the traditional approach to the experience of art, in which the object of observation sits passively waiting for the viewer's response. Here, the artwork (the observed) also functions as a tool for the participant to peek "into" his or her own mind as expressed in brain signals and their levels of activity.

ArtWaves emerged as an interdisciplinary collaboration between digital artist Petronio Bendito, Associate Professor of Visual Communication Design in the Patti and Rusty Rueff School of Design, Art and Performance, and computer scientist Tim Korb, former Assistant Head in the Department of Computer Science, both at Purdue.

Experience your falls to overcome them

Globally, falls are a major health problem. According to the World Health Organization (WHO), falls are the second leading cause of unintentional injury deaths worldwide. Each year, an estimated 646,000 people die from falls, and 37.3 million falls are severe enough to require medical attention. Preventing falls therefore requires education, training, and safer environments.

Our approach is to prevent falls by training people to experience falls in a safe manner (while harnessed or otherwise supported). We are using a soft elastomeric device to train people. The device is like a carpet on which people can stand and experience falls. The fabricated device is controlled with pressure and vacuum to mimic different types of falls, such as trips and slips. The carpet comprises multiple unit cells that can be operated individually or together in series. The actuation is controlled by an electronic device.
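
A minimal sketch of how such a cell array might be sequenced is shown below; the cell interface, timings, and the trip and slip routines are hypothetical illustrations of the pressure and vacuum control described above, not the actual controller.

```python
import time

class UnitCell:
    """One elastomeric cell in the carpet; hypothetical interface to its valve driver."""

    def __init__(self, index):
        self.index = index
        self.state = "neutral"

    def pressurize(self):
        self.state = "raised"     # positive pressure lifts the cell (trip-like bump)

    def evacuate(self):
        self.state = "collapsed"  # vacuum collapses the cell (slip-like drop)

    def release(self):
        self.state = "neutral"

def simulate_trip(cells, delay=0.2):
    """Raise cells one after another under the leading foot, as in a trip."""
    for cell in cells:
        cell.pressurize()
        time.sleep(delay)
        cell.release()

def simulate_slip(cells, delay=0.05):
    """Collapse a row of cells almost simultaneously, as in a slip."""
    for cell in cells:
        cell.evacuate()
    time.sleep(delay)
    for cell in cells:
        cell.release()

carpet = [UnitCell(i) for i in range(8)]
simulate_trip(carpet[:3])   # perturb only the cells under one foot
simulate_slip(carpet)       # or drop the whole row at once
```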

This will be an interactive training unit for people to experience falls in a safe manner. It will illustrate how future training devices might prevent falls by improving balance and gait.

This will be displayed by Murali Kannan Maruthamutu, Postdoctoral Research Assistant, Purdue.

Experience your heartbeat

The heart is an essential and complex organ. Heart disease is the leading cause of death among men and women in the US. One in four deaths in the US is due to heart-related disease. Coronary heart disease, arrhythmia, and myocardial infarction are some examples of heart-related diseases. Arrhythmia is a condition in which the heartbeat is irregular, either too slow or too fast. A faster-than-normal heartbeat is called tachycardia, and a slower-than-normal heartbeat is called bradycardia. To study these conditions, it is important to develop an artificial device that mimics the properties of the heart muscle.

Our approach is to build a physical model of the heart that can beat in sync with the user's heart. We will 3D-print an anatomically shaped mold. Using this negative mold, we will fabricate a heart-like device from a soft elastomer (Ecoflex 00-30). Ecoflex is an elastomeric material with mechanical properties similar to those of heart muscle. The actuation of the heart will be controlled with our electronic device to mimic the tachycardia and bradycardia conditions. Users will wear sensors that measure heart rate, and the physical heart model will respond to the user's heartbeat.
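
The control idea can be sketched as follows; this Python is illustrative only, and the `read_heart_rate` and `actuate_contraction` callbacks are hypothetical stand-ins for the wearable sensor and the pneumatic drive of the elastomer model, which are not specified here.

```python
import time

TACHYCARDIA_BPM = 100  # commonly cited resting thresholds
BRADYCARDIA_BPM = 60

def classify(bpm):
    """Label the measured heart rate as bradycardia, normal, or tachycardia."""
    if bpm < BRADYCARDIA_BPM:
        return "bradycardia"
    if bpm > TACHYCARDIA_BPM:
        return "tachycardia"
    return "normal"

def beat_loop(read_heart_rate, actuate_contraction, beats=10):
    """Drive the elastomer model in sync with the wearer's measured heart rate."""
    for _ in range(beats):
        bpm = read_heart_rate()       # hypothetical: value from a wearable sensor
        period = 60.0 / max(bpm, 1)   # seconds per beat
        actuate_contraction()         # hypothetical: pressurize, then vent, the model
        print(classify(bpm), f"{bpm:.0f} bpm")
        time.sleep(period)
```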

This will be an interactive model that lets participants see and feel a physical model of their heart. It will respond to their own heartbeat, and sitting versus standing or exercising will produce a visible change in heart rate.

This will be displayed by Murali Kannan Maruthamutu, Postdoctoral Research Assistant, Purdue.

Intersection of Art and Science Display

This presentation examines a wide range of expressive approaches that emerge at the intersection of art, science, and technology. It is a collaboration between the Department of Computer Science and the Patti and Rusty Rueff School of Design, Art and Performance at Purdue University. The display is an extension of an existing exhibition currently on view on the 3rd floor of the Lawson Computer Science Building. It features 15 international artists, working independently or collaboratively, who applied scientific processes to the creation of visual expressions and related artifacts. The artists explored a wide range of topics from the perspectives of both science and the humanities, ranging from recursion and quantum computing to music and movement.

The exhibition aims to engage students, faculty, and visitors, to inspire collaborations and interdisciplinary thinking, and to educate about the increasing role that computational and visual processes play in the creation of real and virtual experiences. The works exhibited characterize new possibilities of human experiences and expressions—from purely aesthetic enjoyment to the development of deep and sophisticated concepts.

Participants of the Purdue 2050: Conference of the Future are invited to further explore the works by visiting the exhibition at the Lawson Computer Science Building.

The exhibition was curated and is presented by Petronio Bendito, Associate Professor of Visual Communication Design in the Patti and Rusty Rueff School of Design, Art and Performance, and Tim Korb, computer scientist and former Assistant Head in the Department of Computer Science, both at Purdue.

Music Vision

Music Vision is a Chinese-style electronic music visualization. The purpose of the project was to explore the possibilities at the intersection of graphic design, dynamic graphics, and music. Designed and conceived by Visual Communication Design graduate student Fanfei Zhou, VCD MFA, 2019. This work will be presented by Li Zhang, Professor of Visual Communication Design, Purdue.

Plasmonic Kaleidoscope

The proposed object is a display that fuses scientific research with art. Groundbreaking research in nanotechnology demonstrates the variable nature of structural color. In a prior experiment by students and faculty at the Birck Nanotechnology Center, it was demonstrated that the apparent color of an optimized nanosurface changes with the angle and polarization of incident light. The sculpture is proposed as an interactive object that allows the general public to experience the variability of color observed in the research laboratory. Visitors will be immersed in a physical kaleidoscope that they control with two rotating wheels, echoing the variation of polarization. The physical kaleidoscope is made out of acrylic mirrors and wood (and other materials). The kaleidoscope idea originated as an application of research by doctoral student Maowen Song and Professor Alexandra Boltasseva. That research inspired us to take the idea further by creating an engaging experience for the Greater Lafayette and Purdue communities.
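
As a loose illustration of the interaction (not a model of the actual nanosurface), the sketch below blends two reflection colors with a Malus-law-like cos² weight as a wheel is turned; the two colors and the weighting are assumptions.

```python
import math

# Illustrative only: blend two polarization-dependent reflection colors as the
# viewer turns a wheel.  The colors and the cos^2 weighting are assumptions
# standing in for the measured response of the nanosurface.
COLOR_AT_0_DEG  = (30, 60, 200)    # e.g. bluish reflection for one polarization
COLOR_AT_90_DEG = (220, 180, 40)   # e.g. golden reflection for the orthogonal one

def apparent_color(wheel_angle_deg):
    """Mix the two reflection colors with a cos^2 weight on the wheel angle."""
    w = math.cos(math.radians(wheel_angle_deg)) ** 2
    return tuple(round(w * a + (1 - w) * b)
                 for a, b in zip(COLOR_AT_0_DEG, COLOR_AT_90_DEG))

print(apparent_color(0))    # -> (30, 60, 200)
print(apparent_color(45))   # a mixture of the two
print(apparent_color(90))   # -> (220, 180, 40)
```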

This will be displayed by Esteban Garcia, Assistant Professor of Computer Graphics at Purdue in collaboration with Maxwell Carlson, Davin Huston and Alex Whaley (Carlson Garcia Collective).

Plasmonic Mandalas

We are designing new light projections using tiny mirrors covered with randomized silver particles. Our work in progress shows the first test of silver particles arranged to reflect designed colors (in this case, blue and yellow). Through an optical arrangement, we project the reflections at a larger scale, about 24" x 24". For this project, we had to create an optical device that projects the color reflection. This project is in collaboration with Professor Alexandra Boltasseva, Postdoctoral Researcher Piotr Nyga, and graduate student Sarah Nahar Chowdhury (all from Purdue). We chose the theme of the mandala because mandalas are symbolic representations of the self as well as reflections of the universe.

This work will be displayed by Esteban Garcia, Assistant Professor of Computer Graphics, Purdue.

Digital Borneo: Climb to the Top of the Rainforest in Borneo

This experience immerses participants in the unique sonic voice of an ancient forest in Borneo, Brunei, collected by scientists at the Discovery Park Center for Global Soundscapes. This megadiverse place will be presented through an innovative network of transparent piezoelectric audio speakers developed at the Birck Nanotechnology Center by Mukerrem Cakmak, Professor of Materials and Mechanical Engineering, and Armen Yildrim, Ph.D. student. The audio speakers were created at Purdue through the installation of a roll-to-roll manufacturing line that produces low-cost transparent piezoelectric films, which can also be used as ultra-sensitive force sensors. The thin, transparent audio speakers are sized according to species and their frequencies, regenerating one of the oldest terrestrial soundscapes on Earth, estimated to be over 100 million years old. Participants will “climb” to the top of the canopy, where towering trees rise to 90 meters. The Center for Global Soundscapes has been studying biodiversity trends at pristine and disturbed biomes around the globe through big-data audio recordings probed with artificial intelligence tools. These recordings will serve, in part, as acoustic fossils of natural habitats being lost worldwide, some of which will have vanished by 2050. This exhibit was designed by Bryan C. Pijanowski, Professor and University Faculty Scholar, Department of Forestry and Natural Resources, and Kristen Bellisario, Postdoctoral Research Associate, with sound and light installation support by Davin Huston, Assistant Professor in the School of Engineering Technology.
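
As a simple illustration of how sounds might be routed to speakers sized for different frequency ranges (not the exhibit's actual signal chain), the sketch below assigns a call's peak frequency to a hypothetical speaker panel; the band edges and panel names are assumptions.

```python
# Illustrative only: route each species' typical call band to a speaker panel
# sized for that band.  The band edges and panel names are assumptions.
SPEAKER_PANELS = [
    ("canopy_small", (4000, 12000)),  # insects, small birds (Hz)
    ("midstory",     (1000, 4000)),   # most birdsong
    ("understory",   (200, 1000)),    # frogs, larger birds
    ("ground_large", (20, 200)),      # large mammals, low rumbles
]

def assign_panel(peak_hz):
    """Pick the speaker panel whose band contains the call's peak frequency."""
    for name, (lo, hi) in SPEAKER_PANELS:
        if lo <= peak_hz < hi:
            return name
    return "midstory"  # fallback for out-of-range peaks

print(assign_panel(2500))   # -> "midstory"
print(assign_panel(60))     # -> "ground_large"
```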

LHP's Autonomous RC Development Car

Built largely from scratch, the vehicle started life as a Traxxas hobby-grade RC car, a platform LHP also uses in several of its STEM and training programs, including a new Autonomous Vehicle Technology course in its LHPu division. Now running on an NVIDIA Xavier, it is designed to serve as a testing and development platform for years to come. While it currently uses LIDAR and a depth-sensing camera for navigation, it is designed to integrate other technologies that LHP is also helping to pioneer, notably LiFi and 5G for V2X communication capabilities, which are believed to be instrumental in transportation automation in the long term. The car is also currently serving as a test vehicle for a master’s thesis on predictive maintenance and machine learning by a student at the University of Bologna in Italy, in partnership with LHP Europe. For this demonstration, we will place various objects in the vehicle’s path, demonstrating its ability to safely stop, correct its course, and continue on.
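
As an illustration of the stop-correct-continue behavior (not LHP's actual software stack), the sketch below reduces speed or stops based on the nearest LIDAR return in a front-facing cone; the distance thresholds and cone width are assumptions.

```python
import numpy as np

STOP_DISTANCE_M = 0.5   # assumed thresholds for a small RC platform
SLOW_DISTANCE_M = 1.5
FRONT_CONE_DEG = 30     # only consider returns ahead of the car

def drive_command(ranges_m, angles_deg, cruise_speed=1.0):
    """Return a speed command based on the nearest LIDAR return in the front cone."""
    ranges_m = np.asarray(ranges_m)
    angles_deg = np.asarray(angles_deg)
    ahead = np.abs(angles_deg) <= FRONT_CONE_DEG
    if not ahead.any():
        return cruise_speed                                # nothing measured ahead
    nearest = ranges_m[ahead].min()
    if nearest < STOP_DISTANCE_M:
        return 0.0                                         # obstacle: stop
    if nearest < SLOW_DISTANCE_M:
        return cruise_speed * (nearest / SLOW_DISTANCE_M)  # ease off as it closes in
    return cruise_speed                                    # path clear
```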

User Experience Design

To build "AR (Augmented Reality) commerce" experience by breaking paper menu barriers (with AR Portion, real-time translation, allergy alarm, online ingredient, etc). Designed and conceived by Visual Communication Design graduate student Fanfei Zhou, VCD MFA, 2019 This will be displayed by Li Zhang, Professor of Visual Communication Design, Purdue.
