The main goal of the project was to create a product able to predict and display the chords that accompany a song. To do this, a user selects a song from YouTube and inputs the link into our software. From there, we use signal processing techniques to find the chords and beat of the music. Once that is done, the song is played and the chords are displayed on an LED matrix in the style of “Guitar Hero,” where the user can play along to the selected song on a real keyboard.
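The write-up doesn't go into the signal processing details, but one common building block for chord detection is the chroma vector: fold the spectrum of a beat-length window into 12 pitch-class bins and match the result against chord templates. A minimal pure-Python sketch of that folding step (the helper names and the toy spectrum below are illustrative, not the project's actual code):

```python
import math

# Map a spectrum (frequency, magnitude pairs) into a 12-bin chroma
# vector: each bin collects the energy of one pitch class (C, C#, ...).
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def chroma_vector(spectrum):
    """spectrum: iterable of (freq_hz, magnitude) pairs, e.g. FFT peaks."""
    bins = [0.0] * 12
    for freq, mag in spectrum:
        if freq <= 0:
            continue
        # Convert to a MIDI note number (A4 = 440 Hz = note 69),
        # then fold into a single octave.
        note = 69 + 12 * math.log2(freq / 440.0)
        bins[int(round(note)) % 12] += mag
    return bins

def strongest_pitch_class(spectrum):
    bins = chroma_vector(spectrum)
    return PITCH_CLASSES[bins.index(max(bins))]

# An A major triad (A4, C#5, E5) should light up the A bin most strongly.
print(strongest_pitch_class([(440.0, 1.0), (554.37, 0.8), (659.25, 0.8)]))
```

A full system would compare the chroma vector against templates for each chord (major, minor, etc.) rather than just reading off the strongest pitch class.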
Composting is a method of recycling food scraps, yard trimmings, and plant waste into nutrient-rich soil through controlled decomposition. This has the benefit of diverting trash from landfills, reducing methane emissions, and building soil health. However, composting requires very specific temperature and moisture conditions. Compost-O-Matic is an Internet-connected compost monitoring system that uses smart sensing and analytics to take the guesswork out of maintaining a compost pile. The system takes automated temperature and moisture readings from a compost bin and shares them with the user through a web interface to help inform management of the pile.
This project sits at the intersection of biology and engineering, harnessing the electrical impulses of the body to control a robotic arm. The signals are captured using several surface electromyography (sEMG) sensors placed on the forearm, which pick up the action potentials that occur when a user contracts their forearm muscles. Using machine learning, our project interprets the sEMG data as recognizable gestures, which are then reproduced by a robotic hand driven by servo motors.
It’s Coming Home is a robot that can swing a leg and kick a soccer ball to a desired location in the goal from the penalty spot. The robot has an onboard camera that takes a picture of the goal, and through a user interface the user selects the spot the robot should shoot at. The robot then makes the necessary mechanical adjustments and shoots the ball.
Kooka is a cooking assistant robot arm. Currently, it is able to move a ladle in a stirring motion, which is useful for unclumping pasta while it is cooking. Potential future goals include enabling chopping fruits and vegetables as well as assembling ingredients.
Our project is a 3D-printed quadruped robot that uses solenoids and ball joints to walk while avoiding obstacles and staying upright. Rather than relying on advanced software, our goal was to make the robot capable of walking by exploiting the mechanical properties of our components. The solenoids lift the bottoms of the robot’s legs rather than rotating them, allowing it to clear low obstacles on the ground. Ball joints attached to the feet prevent the robot from tipping when off balance. The robot can be controlled from a phone app over a Bluetooth connection, which lets a user move it forwards and backwards.
Inspired by STEEZY Studios’ “Reverse Choreography” series, this project captures a user’s movement and generates a Spotify playlist that fits the energy of that movement. The user first moves in front of a capture camera to a selected tempo. After analyzing the movement data and prompting the user about the general feeling, the system automatically generates the playlist and queues it on Spotify. Throughout the process, visual effects react to the user’s movement, and once the playlist is generated, to the song audio as well.
Robot Rock is an automated drumming robot. It can lay down a predefined beat or improvise one with a snare, tom, kick drum, cymbal, and maracas. For visual flair, a llama-corn mascot sits atop the drum throne and shakes its head while LED strips flash to the beat.
We designed a robot capable of printing custom pancake designs. Users simply upload an SVG image or a hand-drawn design to our web interface and watch the robot cook a precise pancake in their desired shape.
Our main goal is to build Hyperactive Noise Canceling (HNC) headphones, our own version of Active Noise Canceling (ANC) headphones. In general, ANC headphones use microphones to analyze and digitally process noise outside the headphones, then play anti-noise to cancel out the background noise. At the most basic level, this involves figuring out the frequencies, amplitudes, and phases of the background noise. Our HNC technology can be implemented on any pair of headphones and will be able to selectively cancel out sine waves of different frequencies.
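The core idea of anti-noise can be sketched in a few lines: a sine wave shifted by half a period (pi radians) cancels the original sample-for-sample. A small illustration, assuming a 44.1 kHz sample rate (an assumed value, not necessarily the project's):

```python
import math

SAMPLE_RATE = 44100  # samples per second (assumed)

def sine_wave(freq, amplitude, phase, n_samples, rate=SAMPLE_RATE):
    """Sample a sine wave of the given frequency, amplitude, and phase."""
    return [amplitude * math.sin(2 * math.pi * freq * t / rate + phase)
            for t in range(n_samples)]

def anti_noise(freq, amplitude, phase, n_samples):
    # Shifting the phase by pi produces a wave that is the exact
    # negative of the original at every sample.
    return sine_wave(freq, amplitude, phase + math.pi, n_samples)

noise = sine_wave(100.0, 1.0, 0.0, 441)   # 10 ms of a 100 Hz tone
anti = anti_noise(100.0, 1.0, 0.0, 441)
residual = max(abs(n + a) for n, a in zip(noise, anti))
print(f"max residual after cancellation: {residual:.2e}")
```

The hard part in real ANC is estimating the frequency, amplitude, and phase of the incoming noise fast enough that the anti-noise stays aligned with it; the cancellation itself is just this phase inversion.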
Our main goal this semester was to build a detachable skateboard safety module that not only amplifies the user’s spatial awareness but also alerts others to the user’s movements and actions. Here at USC, skateboarding is very popular among students, with people riding throughout campus and on the streets of South Central LA. Almost everyone here knows students who have gotten into skateboarding accidents, which can be very dangerous. Our system increases the safety of skateboarding by adding rear/blind-spot detection and automatic turn blinkers that alert the user to their surroundings and alert the cars and pedestrians around them.
As one of the two teams on the Frontlinez project, our goal was to build a Snake Game-inspired robot that drags around an onboard marker to draw the longest line possible on a paper arena without crossing the line drawn by the opponent’s robot. After the robots were built, the two teams would then participate in a Frontlinez competition during the Makers Fall Showcase.
This project was a competitive drawing robot built to face the other Frontlinez team (GND). The goal of the competition was to draw the longest uninterrupted line possible without crossing over the opponent's line or leaving the boundaries.
The purpose of this project is to build a robot capable of locating different people at a dinner table and passing butter to them on command. The idea was made popular by a scene from Rick and Morty. It combines knowledge of mechanical, electrical, software, and robot design.
Smart Glasses are a pair of glasses that can do smart things, including voice recognition, image recognition, and text-to-speech. The system includes an iOS app, an ESP-32 CAM microcontroller, and a Node.js server. BLE connects the iOS app to the ESP-32 CAM, and regular HTTPS connects the app to the server. A microphone on the microcontroller streams audio data to the app for voice recognition. The recognized text is then sent to the server, where it is proxied to DialogFlow to handle bot responses. The bot response is parsed by the app, which commands the ESP-32 to take a picture or turn on a light.
Based on the Boston Dynamics dog, RoboDog took the SpotMicro design and aimed to make it more aesthetically friendly. The goal of the project was to show that robotic animals can be aesthetically pleasing as well as cool. This was done by attempting to attach a tail and reworking the dog's head.
One of the biggest parts of this project was learning how to use the Robot Operating System (ROS), and it also required a lot of CAD knowledge. The build itself is purely mechanical, and it took a lot of time because we were still learning how to 3D print parts.
This will not be the last time this dog is seen, as the goal is to make it more functional. At the time of the showcase, one of the big issues was learning how the servos worked; it turned out they were not calibrated properly, so the next step is to fix the calibration and get the dog walking.
Zoomba is a telepresence robot that a user anywhere in the world can operate over the Internet. The user can drive Zoomba as well as see, hear, be seen, and be heard by people on the robot's side.
teenage Jeanette makes a garden. is an outdoor, interactive deep water culture (DWC) hydroponics system that plays synthesizer sounds and lights up depending on any one person's proximity to it. This project was installed in the parking lot of the USC Graduate Fine Arts Building (IFT) on Flower St. and is planned to be reinstalled in the E-Quad of the USC Viterbi School of Engineering.
teen Jean was inspired by artists Félix Gonzáles-Torres, Andy Goldsworthy, Olafur Eliasson, Noah Purifoy, and the music of Emily A. Sprague. It was made to commemorate the lives lost due to the pandemic and to showcase human resilience and the experience of loss and love that ultimately connects us to something higher than ourselves.
Dude Where's My Bike is an LTE IoT security system aimed at preventing bike theft on campus. It has motion sensors and a GPS module, so the bike's position is always known to the user via a mobile app. The backend uses an HTTP/MQTT bridge, which enables smooth and streamlined communication between the microcontroller on the bike and the native mobile app on the phone. With Dude Where's My Bike, your bike is always safe!
Portal Gun’s goal was to create a replica of the portal gun pictured in Rick and Morty. The user enters a location on an intuitive dial UI and then presses the trigger button. The Raspberry Pi inside then requests Google Earth images of that location and stitches them together. Finally, it projects the final picture, with a spinning portal behind it, through an HDMI connection to the mini projector inside. This involved using the Google Earth API, image stitching with OpenCV, soldering components together, and designing and 3D printing a shell.
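The stitching step can be illustrated independently of the Google Earth API: given a grid of equally sized image tiles, paste each one at its offset in a single canvas. A toy pure-Python sketch, with nested lists standing in for the image arrays the real OpenCV pipeline would use:

```python
# Given a grid of equally sized image tiles (each a 2-D list of pixels),
# paste them side by side into one canvas. Tile fetching is omitted;
# the tiles here are stand-ins for downloaded satellite imagery.

def stitch(tiles):
    """tiles: 2-D list where tiles[row][col] is a 2-D pixel list."""
    tile_h = len(tiles[0][0])
    tile_w = len(tiles[0][0][0])
    rows, cols = len(tiles), len(tiles[0])
    canvas = [[0] * (cols * tile_w) for _ in range(rows * tile_h)]
    for r, row in enumerate(tiles):
        for c, tile in enumerate(row):
            for y in range(tile_h):
                for x in range(tile_w):
                    canvas[r * tile_h + y][c * tile_w + x] = tile[y][x]
    return canvas

# Four 2x2 tiles stitched into one 4x4 image.
a = [[1, 1], [1, 1]]; b = [[2, 2], [2, 2]]
c = [[3, 3], [3, 3]]; d = [[4, 4], [4, 4]]
print(stitch([[a, b], [c, d]]))
```

Real stitching of overlapping photos also needs feature matching and blending (which OpenCV provides); for pre-aligned map tiles, simple grid pasting like this is enough.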
The goal of our project was to create a real-life version of Wizard's Chess from Harry Potter, in which chess pieces move autonomously across a board as dictated by player voice commands. Players make moves by interacting with a web application built with React that contains the voice-recognition logic and a basic UI displaying the game board. These moves are then sent through HTTP requests to a Python Flask server running on a Raspberry Pi, which processes the moves and controls the stepper motors and electromagnets that carry the pieces across the physical board. Each chess piece contains a magnet in its base that is attracted by the electromagnet moving under the board. The pieces also break open when captured: reversing the polarity of the electromagnet flips the magnet within the piece above and triggers the sides of the piece to fall open. This mechanism was created by 3D printing each piece with two mirrored sides held together by hinges.
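The move pipeline, from algebraic notation in the web app to motor commands on the Pi, can be sketched as below. SQUARE_MM and STEPS_PER_MM are illustrative values, not the project's real calibration:

```python
# Translate an algebraic chess move like "e2e4" into zero-based board
# coordinates and stepper-motor step counts.

SQUARE_MM = 50        # width of one board square (assumed)
STEPS_PER_MM = 10     # stepper resolution (assumed)

def square_to_xy(square):
    """'e2' -> (file, rank) as zero-based grid coordinates."""
    file = ord(square[0]) - ord('a')   # a..h -> 0..7
    rank = int(square[1]) - 1          # 1..8 -> 0..7
    return file, rank

def move_to_steps(move):
    """'e2e4' -> (dx, dy) step deltas the motor controller would run."""
    (x0, y0), (x1, y1) = square_to_xy(move[:2]), square_to_xy(move[2:])
    dx = (x1 - x0) * SQUARE_MM * STEPS_PER_MM
    dy = (y1 - y0) * SQUARE_MM * STEPS_PER_MM
    return dx, dy

print(move_to_steps("e2e4"))  # e-file unchanged, two ranks forward
```

In the actual system the Flask server would run this kind of conversion before driving the electromagnet along the computed path under the board.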
Team Heat Press created a fully functional airbag heat press for printing custom t-shirts. The electrical team focused on a Bluetooth-controlled heating pad, and the mechanical team created the heat press frame and airbag. The team had so much fun learning how to use power tools together!
Dealer No Deal is a BlackJack card-dealing robot that makes the game a little more interesting if you know what you're looking for. If you enter the secret code of button inputs, a hidden camera in the robot looks at the next card about to be dealt and lets the player know whether they should Hit or Stand. The goal of this project is to help me make some money by scamming the rest of Makers :P and never lose at BlackJack. The robot is made of custom 3D-printed parts and a dual-roller system that ensures cards are dealt accurately. The software uses a Python script for the BlackJack game logic as well as OpenCV on a Pi Camera for the card-vision system.
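The advice logic is simple once the camera has identified the next card: hit only if that card won't bust the current hand. A sketch of that decision (the helpers below are hypothetical, not the project's actual script):

```python
# Basic Hit/Stand advice when the next card is known via the hidden
# camera: hit only if the known card keeps the hand at 21 or under.
CARD_VALUES = {str(n): n for n in range(2, 11)}
CARD_VALUES.update({"J": 10, "Q": 10, "K": 10, "A": 11})

def hand_value(cards):
    """Blackjack hand total, demoting aces from 11 to 1 as needed."""
    total = sum(CARD_VALUES[c] for c in cards)
    aces = cards.count("A")
    while total > 21 and aces:
        total -= 10
        aces -= 1
    return total

def advice(hand, next_card):
    return "Hit" if hand_value(hand + [next_card]) <= 21 else "Stand"

print(advice(["10", "6"], "5"))   # 21 -> Hit
print(advice(["10", "6"], "9"))   # 25 -> Stand
```

The OpenCV side would feed `next_card` from its card-recognition output; everything else is ordinary game logic.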
MASK OFF is a smart mask-detecting system that keeps the anti-maskers away! Our device uses computer vision to detect a mask on someone's face and unlocks or locks the door to your establishment accordingly. It works on everyone, from your neighborhood Karens to your neighborhood dogs. MASK OFF implements a face-mask detection model that prevents non-maskers from entering a unit. All a user has to do is press a button, and based on whether our camera detects a mask or not, a signal is sent to a Raspberry Pi that unlocks the door. We initially implemented an open-source neural network model using Keras and TensorFlow, then integrated the model with the Raspberry Pi so that pressing the button sets off the unlocking signal.
SpacePainter is a multi-axis CNC light painting robot. Essentially, it is a robotic arm that moves an LED in programmed patterns. When photographed by a long-exposure camera, the robot's path gets traced out in space by the light. It can be used to create cool effects in photos or as a multimedia installation piece. It is built with custom 3D printed parts and is controlled by an Arduino microcontroller. It uses two stepper motors to move and an Adafruit Neopixel as an LED. Over the course of this semester, we have designed, built, and programmed the entire robot from scratch. The robot can paint five different light paintings and we are working on adding more!
Smart Lock is an IoT lock that can be controlled from your phone. Our front end is connected to a server that communicates with an Arduino. When the Arduino receives a message, it opens or closes the lock depending on the message received. We designed a custom lock using CAD that can be attached to a deadbolt, and programmed the embedded system code as well as the communications and backend of the project.
The goal of our project was to make a tic-tac-toe board that plays the game for you! You can make moves using voice commands, and X and O pieces will move across the board to the correct locations on the grid. You can also either play against a friend, or against an AI at three different difficulty levels. To implement speech recognition, we used a React web app and a speech-recognition API that listens to your voice through any laptop microphone. The moves you make are then sent back to our Python back-end, which utilizes the Flask web framework. Here, our AI algorithm determines what moves to make in response, and our motor control code moves stepper motors to place pieces on the physical grid. The board itself is a 2-axis motor stage with linear slides, pulleys, and 3D-printed mounts. An electromagnet that is moved around by the motors turns on and off to pull magnetic pieces across the top of the board.
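For the hardest difficulty level, a standard choice (not necessarily the project's exact algorithm) is full minimax search, which tic-tac-toe's tiny game tree makes easy to do exactly. A self-contained sketch:

```python
# Minimax over the full game tree: the AI ("O") picks the move with
# the best guaranteed outcome. Board is a list of 9 cells: "X", "O",
# or "" for empty.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
         (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]

def winner(b):
    for i, j, k in LINES:
        if b[i] and b[i] == b[j] == b[k]:
            return b[i]
    return None

def minimax(b, player):
    """Return (score, move) from O's perspective: +1 win, -1 loss."""
    w = winner(b)
    if w:
        return (1 if w == "O" else -1), None
    empties = [i for i, c in enumerate(b) if not c]
    if not empties:
        return 0, None  # draw
    scores = []
    for i in empties:
        b[i] = player
        s, _ = minimax(b, "X" if player == "O" else "O")
        b[i] = ""  # undo the trial move
        scores.append((s, i))
    return max(scores) if player == "O" else min(scores)

board = ["X", "X", "", "", "O", "", "", "", ""]
print(minimax(board, "O")[1])  # O must block X's threat at cell 2
```

The easier difficulty levels could then fall back to random or one-ply moves instead of the full search.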
Covid Simulation is a guided build that aims to simulate the spread of COVID-19 using a Respiratory Exchange Model. Using math and physics models, we generated equations for the trajectories of particles and, later, total exposure to airborne COVID-19 particles. We used a lot of Python programming, including libraries such as Matplotlib and NumPy, and also gained an understanding of the physical phenomena at play when quantifying the spread of a particle through the air. This guided build had an emphasis on Python programming skills and was headed by Radhika Bhuckory.
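A minimal version of such trajectory equations, ignoring drag and evaporation (which the full model may include), treats an exhaled droplet as a projectile. The parameter values below are illustrative, not the guided build's actual numbers:

```python
import math

# Idealized trajectory of an exhaled droplet as a projectile:
# x(t) = v*cos(theta)*t,  y(t) = h + v*sin(theta)*t - g*t^2/2.
G = 9.81  # gravitational acceleration, m/s^2

def trajectory(v0, theta_deg, h0, dt=0.01):
    """Return (x, y) points until the droplet reaches the ground."""
    theta = math.radians(theta_deg)
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    points, t = [], 0.0
    while True:
        x = vx * t
        y = h0 + vy * t - 0.5 * G * t * t
        if y < 0:
            break
        points.append((x, y))
        t += dt
    return points

# A cough at 5 m/s, angled slightly downward, from mouth height 1.6 m.
path = trajectory(v0=5.0, theta_deg=-10.0, h0=1.6)
print(f"travelled {path[-1][0]:.2f} m before landing")
```

With NumPy the time loop becomes a vectorized array expression, and Matplotlib can plot `path` directly to visualize the arc.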
This guided build used Autodesk Fusion 360 CAD software to design a custom catapult. We created complex geometric shapes to form the different parts of the catapult, including the base, the arches, and the arm. We placed an emphasis on 3D design skills and 3D printing to give students a good understanding of how to plan, design, and implement CAD models. Headed by Ashwan Kadam.
In this very applicable guided build project, we learned how to hack a TI-84 graphing calculator so it has internet communication capabilities. There was an emphasis on Arduino programming skills (specifically, using an ESP8266) and learning how to navigate the treacherous landscape of TI-BASIC coding. The PM for IOT TI84 was Devin Mui.
The goal of this guided build was to learn how to interface an Arduino with sensors and use the Processing IDE to visualize data. An ultrasonic sensor gathered data for the sonar system and was rotated by a servo to scan the area around the sensor. We also used serial protocols to communicate between the Arduino and the computer, which visualizes the data using Processing. Headed by PM Efaz Muhaimen.
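Two small conversions underlie a sonar display like this: echo time to distance, and servo angle plus distance to an (x, y) point to plot. A sketch of both, using the standard speed-of-sound formula (the constants are textbook values, not tied to this build's code):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def echo_to_distance(echo_us):
    """Ultrasonic echo time (microseconds, round trip) -> distance in cm.

    The pulse travels to the object and back, hence the divide by 2.
    """
    return (echo_us * 1e-6) * SPEED_OF_SOUND / 2 * 100

def to_cartesian(angle_deg, distance_cm):
    """Servo angle + distance -> (x, y) point for the sonar plot."""
    a = math.radians(angle_deg)
    return distance_cm * math.cos(a), distance_cm * math.sin(a)

# A 1166 us echo corresponds to roughly 20 cm.
d = echo_to_distance(1166)
print(f"{d:.1f} cm at 90 deg -> {to_cartesian(90, d)}")
```

On the Arduino side the same arithmetic runs on the raw `pulseIn`-style timing value; Processing then draws each (x, y) point as the servo sweeps.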
The goal of this project is to use bevel gears to create a cool polygonal shape. When you rotate one gear, it distorts the shape, but as you continue to rotate it gradually returns to its original form. This GB not only taught members to CAD model individual parts, but also connect them to form a functioning composite design. As an added bonus, members created their own custom Adobe Illustrator designs and imported them onto their CAD models! Led by PM Ashwan Kadam.
As its name may suggest, this GB taught members how to create PCB schematics and then lay out the components on the board using PCB design software. In doing this, members not only learned how to use Autodesk Eagle PCB design software, but also how to program their LED-matrix microcontroller with an Arduino and solder parts onto their PCB. PMed by Efaz Muhaimen.
This GB introduced some crazy breadboarding skills, with members implementing resistor ladders, op-amps, and more in their designs. In addition, they used Arduino programming to generate the various waveforms required to produce musical sounds and play them through their circuit. This GB had an emphasis on Arduino programming and breadboarding, and was led by PM Devin Mui.
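Waveform generation on a microcontroller is commonly done with lookup tables holding one period of each shape. The tables themselves can be sketched in Python (values in [-1, 1]; on an Arduino they would be scaled to the PWM/DAC range, and this is a general illustration rather than the GB's exact code):

```python
import math

TABLE_SIZE = 256  # samples per period of each waveform

def sine_table():
    return [math.sin(2 * math.pi * i / TABLE_SIZE)
            for i in range(TABLE_SIZE)]

def square_table():
    # High for the first half-period, low for the second.
    return [1.0 if i < TABLE_SIZE // 2 else -1.0
            for i in range(TABLE_SIZE)]

def triangle_table():
    # Rise from -1 to 1 over the first half, fall back over the second.
    return [4 * i / TABLE_SIZE - 1 if i < TABLE_SIZE // 2
            else 3 - 4 * i / TABLE_SIZE for i in range(TABLE_SIZE)]
```

To play a note of frequency f, the firmware steps through the chosen table at f * TABLE_SIZE samples per second, wrapping the index at the end of each period.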
We are building a robotic arm that is controlled by a sleeve that is linked to the arm through Bluetooth. The robot then uses information from the sleeve to mimic the user's motions. To build this project, we used CAD skills to model the arm, electrical engineering skills to wire the robots, and embedded programming & robotics skills to get the arm and sensors working.
Forget Me Not is a project geared to making sure you never forget your valuables at home ever again. Our project involves a central Arduino that is located in your home. It uses Bluetooth Low Energy to communicate with peripheral devices located on your wallet, keys and any other essentials you don't want to leave behind. If the system detects you leaving the house without any of these, it will buzz, reminding you that you left something behind.
Robonaldo is a robot designed to drive autonomously while avoiding obstacles, track a soccer ball, and shoot on goal. Our project uses the Robot Operating System (ROS) and an Nvidia Jetson as the brains of the robot. In the future, we plan to use reinforcement learning to teach the bot to drive and to install an appendage that allows Robonaldo to pull in and shoot a soccer ball. Other aspects of engineering used in Robonaldo include computer vision, Arduino programming, Gazebo simulations, and CAD design.
The goal of our project is to create a guitar that can emulate the sounds of other instruments when it is strummed. Our project consists of four phases: CAD, Manufacturing, Electrical Design, and Programming. The technical skills we needed included CAD software proficiency, electrical/wiring experience, Arduino programming experience, MIDI knowledge, and manufacturing skills.
The goal of our project was to build a robot from scratch and then add the functionality to follow people around and play music. It consisted of three main parts: (1) Physically putting together and wiring the robot, (2) Developing the software so that it could follow a single person, and (3) Adding in the "extras" like playing music from a bluetooth speaker and designing an enclosure for the robot. DJ Roomba required a variety of technical skills, covering a lot from both hardware and software. Our work mainly fit into either embedded development, computer vision, or the general, overarching category of robotics.
We wanted to develop a pen that would turn handwritten text into a digital document. We believe this could help both in classes that do not allow laptops and for people who prefer to take notes by hand. We set out to create a machine learning algorithm that recognizes what is being written so it can later be uploaded to a computer and digitized.
The goal of this project is to make our own Flip-Disc display entirely from scratch. In order to be able to make the largest possible display within our budget, we have spent a lot of time designing a pixel that is easy to make, cheap to produce, and made with recycled materials. Once the display is built we will install it in the Ming Hsieh Institute Lounge -- we are still deciding what to show on screen.
The goal of our project was to create an autonomous spider robot capable of traversing different terrains and climbing over obstacles. The robot is 3D-printed and consists of 8 legs, each with 3 servo motors, connected to a BeagleBone Blue. The spider can be controlled either through a direct connection to the BeagleBone Blue or wirelessly with a controller. An Nvidia Jetson Nano with an attached camera adds computer vision capability, allowing the robot to move autonomously. The technical skills involved include coding in Python, integrating computer vision, programming a microcontroller to control motors, adding Bluetooth communication, and creating CAD models.
Our team built a "magic mirror" which is essentially a smart mirror IoT device that is able to access a myriad of helpful information, from basic things like the weather, time, and date, to reminders from Google Calendar, a Spotify player, and the top social media and news headlines. It has the unique feature of complimenting you when you get closer to it (with the help of an ultrasonic sensor). We configured the brain of the mirror (the Raspberry Pi) with modules from an open source framework and modified them for our mirror. We also built our frame from scratch. Our team got to explore all types of fun technologies and modules through building this mirror, and had the best time putting it together!
Our project was building an IoT Boba Machine which autonomously makes our favorite drink: boba tea. It brews tea in hot water and adds milk, tapioca pearls, and honey. We built and designed the physical machine from the frame and various apparatuses to brew the tea and put in the necessary ingredients to make delicious boba. We also connected the machine to a phone app where you tell it when to start and how to adjust sweetness. Overall, we're a really chill team who likes to have fun but also get work done. This project involves many different components - mobile app design, CAD, working with motors/sensors, IoT integration, working with tools (i.e. drill, band saw, hammer), and is very hands on.
Originally inspired by the CHARIOT project headed by Professor Krishnamachari, HADES aims to create wearable attention trackers for use in the classroom. This allows a teacher to track how many of her students are actually learning the material and lets her adjust mid-lesson to maximize student potential. To create the wearable tracker, we combined existing wearable technology with biosensors to track physical symptoms of decreased attention span, including heart rate, body temperature, and galvanic skin response. The sensors, combined with a WiFi-enabled microcontroller, are packaged into a single glove that is easy for a child to wear and durable.
Micromouse is an existing international competition in which teams build robots that can autonomously navigate mazes and find the fastest path through said maze. Our project was to build a robot capable of performing in a Micromouse event, and then improve that as time goes on to make ourselves more competitive and well-known within the scene. Alongside this, we built a simulator for different maze discovering and solving algorithms, which we used to test our algorithms while our robot was being built.
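The classic Micromouse solving algorithm, and a natural one to test in a simulator like ours, is flood fill: a breadth-first search from the goal labels every cell with its distance, and the mouse always steps to a lower-valued neighbor. A minimal sketch on a toy maze graph (not necessarily the exact algorithm our robot runs):

```python
from collections import deque

# Flood fill (BFS) over a maze graph: maze[cell] is the set of open
# neighbor cells (walls are simply absent edges).

def flood_fill(maze, goal):
    """Assign every reachable cell its step distance to the goal."""
    dist = {goal: 0}
    queue = deque([goal])
    while queue:
        cell = queue.popleft()
        for nxt in maze[cell]:
            if nxt not in dist:
                dist[nxt] = dist[cell] + 1
                queue.append(nxt)
    return dist

def shortest_path(maze, start, goal):
    dist = flood_fill(maze, goal)
    path, cell = [start], start
    while cell != goal:
        cell = min(maze[cell], key=dist.get)  # step "downhill"
        path.append(cell)
    return path

# A tiny 2x2 maze: the four cells connected in a line around one wall.
maze = {(0, 0): {(0, 1)}, (0, 1): {(0, 0), (1, 1)},
        (1, 1): {(0, 1), (1, 0)}, (1, 0): {(1, 1)}}
print(shortest_path(maze, (0, 0), (1, 0)))
```

On a real mouse the maze graph is discovered incrementally, so the flood fill is rerun each time a new wall is sensed; the distances stay cheap to recompute at Micromouse maze sizes.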
Robosketch is a drawing robot that uses pens to draw on 8.5x11" paper. It can draw arbitrary shapes and scalable vector graphics (SVG) files.
The Smart Rebounder is a basketball hoop attachment aimed at improving athletes' shot training. It attaches to the rim of the basket and acts as a slide that funnels made baskets back to the shooter. It uses OpenCV and servo motors to track the shooter's position and rotate the attachment toward the athlete so that their made baskets are funneled back in their direction.
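Turning a detected shooter position into a servo command needs only a pixel-to-angle mapping. A sketch using a simple linear approximation (FRAME_WIDTH and FOV_DEG are assumed values, not the project's, and the real system would read the x-coordinate from OpenCV's detection):

```python
FRAME_WIDTH = 640   # camera frame width in pixels (assumed)
FOV_DEG = 60.0      # camera horizontal field of view in degrees (assumed)

def shooter_angle(cx):
    """Pixel x-coordinate of the detected shooter -> servo pan angle.

    0 degrees points straight ahead; positive angles are to the right.
    Uses a linear pixel-to-angle mapping across the camera's FOV,
    which is a good approximation for a narrow field of view.
    """
    offset = cx - FRAME_WIDTH / 2          # pixels from frame center
    return offset / FRAME_WIDTH * FOV_DEG  # degrees

print(shooter_angle(640))  # shooter at the right edge -> 30.0 deg
print(shooter_angle(320))  # shooter centered -> 0.0 deg
```

The servo would then be driven toward this angle each frame, effectively tracking the shooter as they move around the court.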
The goal of this project was to build a professional-grade home speaker system that leverages IoT technologies to provide audio to different areas of the house. For our project we have built a unit that will receive music streams from multiple devices and output them to the appropriate series of speakers.
We have constructed a musical instrument that relies solely on analog electronics and an interesting EM phenomenon. We've breadboarded a circuit based on a guide we found online, and have put it on a Printed Circuit Board (PCB).