Take your photo in style with bit.stippling, a project that stipples your photo and lets you watch as, dot by dot, your photo gradually appears on the screen!
We were contacted by TEDxBangkok to visualize brain activity stimulated by various smells, in a collaboration between NOSEstory, a scent designer, and the Brain-Computer Interface Lab. Working together with all 3 parties, we concluded that the project's objective was to make the audience aware that smells really do influence the human brain. From there, they would see how smells could be applied in fields such as healthcare, marketing, education, art, and design.
To achieve this objective, we started by visualizing what brainwaves and smells would look like if we could see them, so that audiences could see the difference between before and after a volunteer inhales a scent. We used liquid to illustrate fluidity through the movement created by surface tension, representing the formlessness of brainwaves and smells.
Design & execution
We decided to translate brain activity into an interactive real-time data visualization installation. We picked 3 of the 14 smells NOSEstory created for this project and used 3 milk pools as canvases, one pool for each smell. Food coloring and surfactant were used as paints, representing the brainwave frequencies that relate to emotions. Blue represents delta and theta waves, which are associated with deep relaxation and meditation. Yellow represents alpha waves, associated with calm and light meditation. Red represents beta waves, reflecting an alert, focused state of mind. Surfactant, which creates swirls of color as soon as it touches the liquid canvas, represents gamma waves, which relate to simultaneous processing of information across different brain areas and are involved in learning, imagination, and memory. The results on the liquid canvas varied, depending on each subject's memory of and experience with the smell.
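The wave-to-paint mapping described above can be sketched as a simple lookup. This is an illustrative sketch only; the names and the function are hypothetical, and the real mapping lives inside our Cinder application:

```python
# Hypothetical sketch of the brainwave-band -> paint mapping described above.
WAVE_TO_PAINT = {
    "delta": "blue",        # deep relaxation / meditation
    "theta": "blue",
    "alpha": "yellow",      # calm, light meditation
    "beta": "red",          # alert, focused state
    "gamma": "surfactant",  # swirling; cross-area processing
}

def paints_for(bands):
    """Return the paint to release for each detected frequency band."""
    return [WAVE_TO_PAINT[b] for b in bands if b in WAVE_TO_PAINT]
```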
We used the Cinder framework to develop our application. After a volunteer inhales each prepared smell, the application queries features from the Brain Features App developed by the BCI Lab, calculates the number of corresponding color drops, and sends signals to the control circuits. The control circuits are built from 3 Arduino-based control boards, one for each pool of liquid. Each board controls 4 solenoid valves connected to the liquid containers. The valves are toggled on through relays for 0.1 seconds to release each drop. The boards communicate with the PC over a wired network.
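The PC-side pipeline above can be sketched roughly as follows. This is a minimal sketch under stated assumptions: the board addresses, the JSON wire protocol, and the feature-to-drops scaling are all hypothetical, and the actual application is written in C++ on Cinder, not Python:

```python
import json
import socket

# Hypothetical addresses for the 3 Arduino-based control boards, one per pool.
BOARDS = {
    "pool_1": ("192.168.0.11", 5000),
    "pool_2": ("192.168.0.12", 5000),
    "pool_3": ("192.168.0.13", 5000),
}

def drops_from_power(band_power, scale=10, max_drops=4):
    """Map a normalized band power (0..1) to a whole number of drops.

    The scaling is an assumption; capped at one drop per valve cycle
    times the number available.
    """
    return max(0, min(max_drops, round(band_power * scale)))

def send_drop_command(pool, valve, drops):
    """Send a drop command to a board over the wired network.

    On the board side, each drop pulses the valve's relay for 0.1 s
    (as described in the text). The JSON message format is assumed.
    """
    host, port = BOARDS[pool]
    message = json.dumps({"valve": valve, "drops": drops}).encode()
    with socket.create_connection((host, port)) as conn:
        conn.sendall(message)
```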
Exhibition + the blind volunteer
On the workshop day at TEDx Bangkok 2017 – little things mingle, each volunteer was asked to sit down, wear 4 cup electrodes attached to the head and ears at strategic spots for the most accurate signal, inhale the prepared smells, and watch as the brain processed each smell and transformed it into a visual art installation. Other workshop participants could observe the whole process and see what was happening inside the volunteer's head. They could also try the 3 smells, placed at each pool, to see whether they perceived the same colors as the volunteers.
We were honored by the attendance of Ploy, a talented visually impaired writer. She has published several books and is now interested in drawing and painting. Ploy explained that because she cannot see, she perceives colors as feelings and experiences. "Blue represents sadness, and red makes me imagine the rising sun," she elaborated. We asked her to inhale a smell and translated her brainwaves into signals, which triggered the solenoid valves to release four drops of liquid. When we asked her what color she perceived, she answered, "Orange, leaning toward red." The liquid that dropped into the pool that day was three drops of red and one drop of yellow.
We developed this interactive installation in collaboration with Eyedropper Fill. Our app enables users to share their thoughts by typing on our website. The texts are then combined with visual effects and displayed on the LED screen. The stylized texts can also be shared to Facebook through our app.
Through the "Life.SCB" app, we illustrate the concept of a cashless society, in which no physical cash is involved in any financial transaction. Users receive coins by joining SCB's various activities, such as applying for credit cards or playing the in-app game "Coin Casher". Coin Casher is a coin collection game built on Augmented Reality technology. By scanning the SCB logos located at various spots in the booth, visitors can collect virtual coins and exchange them for rewards. These coins could genuinely be used to purchase various products at the Expo.
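The collect-and-redeem loop described above can be sketched as a small wallet model. This is an illustrative sketch only; the class, method names, and coin values are all assumptions, not the actual Life.SCB implementation:

```python
class CoinWallet:
    """Hypothetical sketch of the Coin Casher collect/redeem logic."""

    def __init__(self):
        self.balance = 0
        self.scanned = set()

    def scan_marker(self, marker_id, reward=10):
        """Credit coins the first time a given SCB logo marker is scanned.

        Rescanning the same marker yields nothing, so each booth spot
        can only be collected once.
        """
        if marker_id in self.scanned:
            return 0
        self.scanned.add(marker_id)
        self.balance += reward
        return reward

    def redeem(self, cost):
        """Exchange coins for a reward if the balance covers the cost."""
        if cost > self.balance:
            return False
        self.balance -= cost
        return True
```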
Joining with Digital Venture, the bit studio created an installation that demonstrates the future of machine learning through an engaging and easy-to-understand concept: physiognomy 4.0. Visitors enjoy the art of face reading and discover their inner characters simply by standing in front of the kiosk and letting the machine 'look' at their faces. Under the hood, the user's facial keypoints are located using machine learning. These keypoints are converted into physiognomy values following traditional face-reading textbooks, and the interpretations are given accordingly.
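The keypoints-to-values step can be sketched as simple geometry over the detected landmarks. The landmark names and the ratio below are illustrative assumptions, not the actual textbook rules used in the installation:

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def physiognomy_ratios(keypoints):
    """Compute simple geometric ratios from detected facial keypoints.

    `keypoints` maps landmark names to (x, y) pixel coordinates. The
    names and the single example ratio here are hypothetical; in the
    installation, such ratios are looked up against textbook rules to
    produce an interpretation.
    """
    eye_span = distance(keypoints["left_eye"], keypoints["right_eye"])
    face_length = distance(keypoints["forehead"], keypoints["chin"])
    return {"eye_to_face_ratio": eye_span / face_length}
```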
The interactive installation "Cash Scanner" simulates converting physical banknotes into digital coins. Using computer vision, the machine determines the number and denomination of the banknotes placed on the table, and awards coins to users accordingly through the coin reward app.
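Once the vision system has counted the notes by denomination, the tallying step is straightforward. A minimal sketch, assuming the detector outputs per-denomination counts and a one-coin-per-baht reward rate (both assumptions):

```python
def coins_for_notes(counts):
    """Sum detected banknotes into a coin reward.

    `counts` maps banknote denomination (in baht) to how many notes
    of that denomination the vision system detected on the table.
    The 1-coin-per-baht rate is a hypothetical assumption.
    """
    COIN_PER_BAHT = 1
    return sum(denom * n for denom, n in counts.items()) * COIN_PER_BAHT
```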