Robotics

The Robotics project began in the spring of 2019. Its goal is to design a device that will help students learn how robotics is used in industry.

The requirements from the project partner include: 

  • Robot must be made for a class setting and fit into its designated storage space 

  • Must be safe for students 

  • Must be interactive 

In the fall of 2019, the project reached the detailed design phase, with the team beginning to code their initial design.

By the fall of 2019, the robotics team had a design for a pick-and-drop robot and had begun programming it. The work was not completed, however, and the incoming team did not know how to continue it, so they started fresh and began generating new ideas for the robot.

They came up with multiple designs to gather feedback on and narrowed them down to two ideas.

Maze Following Robot

The first proposal was for a maze following robot. This design would have sensors around its body to detect barriers, and students would be able to design and build a maze for the robot to navigate. There would also be opportunities in the future to add more features to the robot for further student interaction.

Light Following Robot

The second proposal was for a light following robot. For this product, students would control a light source that the robot could detect with the sensors surrounding its body. Like the other design, there would be opportunities in the future to add more features to the robot to increase student interaction.

With feedback from the project partner, the team began working on the light following robot, in which students control where the light comes from and the robot follows it.

They have determined how they want to make a shell to cover the wires of the robot and have ordered the light sensors they plan to use.

They plan to continue coding the robot and assemble it for delivery. The final deliverable will be the fully assembled robot, programmed with Arduino.

Fall 2020

Last semester, the robotics project team initiated the basic idea of a light-following robot, which would find a single light source and move toward it. The fall 2020 team found the idea overly simple and noticed many areas where the design could be improved, so they decided to keep the basic concept and expand upon it this semester. The team is continuing to build on the idea of a robot programmed to follow sequences of light sources. They will place three light sensors around the exterior of the robot: one on the front, one on the right side, and one on the left side. The team decided a rear sensor would be unnecessary because the robot won't use any backwards mobility; even so, the robot will still be able to detect light from almost any angle.
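The three-sensor steering decision described above can be sketched in plain C++ (a simplified illustration, not the team's actual Arduino code; the `chooseDirection` name, the sensor values, and the threshold are our assumptions):

```cpp
// Simplified sketch of the three-sensor steering decision.
// Assumption: the photoresistors are wired so that brighter
// light produces a higher analog reading.

enum Direction { FORWARD, TURN_LEFT, TURN_RIGHT, STOP };

// Pick the direction of the brightest sensor; stop if no sensor
// sees light above a minimum threshold.
Direction chooseDirection(int left, int front, int right, int threshold = 200) {
    if (left < threshold && front < threshold && right < threshold)
        return STOP;                       // no light source detected
    if (front >= left && front >= right)
        return FORWARD;                    // brightest reading is ahead
    return (left > right) ? TURN_LEFT : TURN_RIGHT;
}
```

On the actual robot, the three arguments would come from `analogRead()` on the sensor pins each loop iteration.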

The new team plans to deliver a soft prototype of the robot to the project partner by the end of this semester. So far, they have made great progress. Currently, they are working on writing the light-sensor code and testing the sensor values, as well as writing basic motor functions.

The design was expanded upon with the addition of a 3D-printed dome, modeled in CAD and built to cover the Arduino and wiring and to promote safety for the student users. We also built individual light sources for the project using our team's mechanical skills and purchased materials.

Our final semester deliverables consist of Charlie the Robot and its corresponding light sources, pictured below. The robot uses a user-driven calibration process, built primarily to enhance the interactivity of the project, to begin following light sources. It can follow a single light source and stop once it is within an approximate range of the light. The robot can also pause, recalibrate, and then begin its path to the next light without the user having to turn the robot completely off and restart the process. Although we plan to implement greater interactivity and mobility, this prototype is being delivered to Mrs. Marvin for testing and evaluation until we can continue expanding on the design in the following semester.
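The stop-within-range behavior could be modeled roughly as follows (an illustrative reconstruction, not the delivered code; the `Calibration` struct, the `closeEnough` helper, and the margin value are our assumptions):

```cpp
// Illustrative stop condition: calibration records the ambient
// light level, and a sufficiently bright front reading is then
// treated as "close enough" to the light source to halt.

struct Calibration { int ambient; };  // baseline reading with no light source nearby

// True when the front sensor reading is far enough above the
// calibrated ambient level to imply the light is within range.
bool closeEnough(int frontReading, const Calibration& cal, int margin = 600) {
    return frontReading - cal.ambient >= margin;
}
```

Calibrating against ambient light (rather than using a fixed threshold) is what lets the same robot work in differently lit classrooms.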

[Photos: Charlie the Robot and its light sources]

Spring 2021 

This semester, DCES Robotics plans to expand on our project design from the previous semester. The team, consisting of the same individuals from Fall 2020, has set goals to implement the following new features: sound effects, object detection, an interactive application, and physical exterior improvements to give the robot a more child-friendly aesthetic.

Sound effects: The software sub-team will work on implementing sound effects on the robot's Arduino Uno. The goal is that once the robot reaches its desired light source, it will play a short song, coded note by note by the team members, to indicate success. To expand on this, the team has considered adding a voice recording the robot could play on reaching the light, for a more personable feature. If time and resources permit, this would be the team's next goal this semester; if not, the Fall 2021 Robotics team could take it over.
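A note-by-note song like the one described might be represented as simple frequency/duration data (the tune, note values, and names here are placeholders; the team's actual song is not specified):

```cpp
// Hypothetical note table for the success melody. On the Arduino,
// each entry would drive one tone(pin, freqHz, durationMs) call.

struct Note { int freqHz; int durationMs; };

const Note MELODY[] = {
    {262, 200},  // C4
    {330, 200},  // E4
    {392, 200},  // G4
    {523, 400},  // C5, held longer to end the phrase
};

// Total playing time of the melody in milliseconds.
int melodyLengthMs() {
    int total = 0;
    for (const Note& n : MELODY) total += n.durationMs;
    return total;
}
```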

Object Detection: This feature is another semester task for the software sub-team. The team wants to use ultrasonic sensors to help the robot detect objects or obstacles between itself and the desired light source. Not only would this protect the robot and keep it intact, but it also serves as a safety measure for the students, who won't have to worry about getting in the robot's way or potentially stepping on it.

Physical improvements: The physical features of the robot delivered last semester were low quality. The wheels lacked the traction needed for the motors to turn them consistently, and the shell was designed strictly for function, and even then it lacked basic features. This semester, we look to improve the design and function of the wheels, as well as remodel the CAD shell to meet industry standards. The new designs will be more visually appealing and cater to the inner features of the vehicle.

Interactive Application: Student interactivity is an important aspect of this project that is largely unaddressed by last semester's Sequential Light Following Robot. Students are limited to controlling the placement of light sources. To address this, our proposed senior design project is an application that will allow students to program the robot on their own. The interface will resemble Scratch (https://scratch.mit.edu/) and use block programming. Scratch, and block programming more broadly, seeks to make programming accessible to individuals who are not yet proficient in writing software. It removes the need to understand language syntax and instead uses functional blocks that the user arranges to perform a function. Within our custom application, each block will correspond to a real-world function on the robot and have parameters the students can set. This will give students full control of how the robot performs a desired function, like following a path of lights. Students would gain the experience of using sensors and software to develop an algorithm that results in an intelligent vehicle, just as in industry. In general, this part of the project aims to create an application for programming the robot that can grow and expand with the student. It contains a variety of commands for the robot, ranging from low-level blocks typically found in programming languages to high-level commands, so students can have the robot perform interesting actions even at the beginning of their programming journey.
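The block-programming idea can be illustrated with a toy interpreter (entirely hypothetical; the real application's blocks, commands, and interface are not specified in this document):

```cpp
#include <vector>

// Toy model of the block-programming concept: each "block" is a
// command the student snaps into a sequence, and running the
// program replays the blocks on a simulated robot.

enum class Block { FORWARD, TURN_LEFT, TURN_RIGHT };

struct RobotState {
    int x = 0, y = 0;
    int heading = 0;  // 0 = north, 1 = east, 2 = south, 3 = west
};

// Execute the student's block sequence and return the final state.
RobotState run(const std::vector<Block>& program) {
    RobotState s;
    for (Block b : program) {
        switch (b) {
            case Block::TURN_LEFT:  s.heading = (s.heading + 3) % 4; break;
            case Block::TURN_RIGHT: s.heading = (s.heading + 1) % 4; break;
            case Block::FORWARD:
                if (s.heading == 0)      s.y++;
                else if (s.heading == 1) s.x++;
                else if (s.heading == 2) s.y--;
                else                     s.x--;
                break;
        }
    }
    return s;
}
```

In the real application, each block would instead map onto a command sent to the robot's Arduino, with student-settable parameters such as distance or turn angle.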
 

Spring 2021 Semester Final Deliverable: Recap and Progress

This semester, the robotics team was incredibly successful, implementing all of the desired features listed above: sound effects, object detection, physical improvements, and an interactive application. The sound effects feature consists of a note sequence that plays when the robot reaches its desired light source, or whenever the student chooses via the application. The object detection works as the team planned: the robot can detect when an object other than the light source is in its path and stops when it is within approximately 6 inches of that object. For physical improvements, an entirely new shell was designed. It provides extra space for wiring and includes built-in slots for the buttons, the screen, the photoresistors, and the speakers. The ultrasonic sensor was placed at the front of the robot, mimicking a car's headlights. In addition, the new shell was painted dark blue with white stars and moons for a more child-friendly aesthetic (see photo). Finally, the interactive application was built and made entirely compatible with the device. Students can program the robot to perform certain movements, such as stopping or turning, and can even write a message for the robot to display on its screen. The final prototype will be delivered to Mrs. Marvin at the Delphi Community Elementary School for use in her 5th grade classroom.

[Photo: the finished robot]