Current Projects


Targeting Computer

  • Project Lead

    Ezra "Zelda"
  • Subteams required

  • Project Finally Completed

  • Description

    Using a Pixy camera and an integrated ultrasonic sensor, we have finally created a targeting system for our robot and future robots. This is a HUGE accomplishment for the team, as making a functional targeting system has been an endeavor of ours for the past eight years. As our mentor David said, "This is our White Whale..." And he is right: this WAS our White Whale. I am honored to be the programmer to have worked on this project, and I am happy Scott and I could accomplish this feat. I dedicate our work to our Coach Davis for the tireless hours he has spent working to make this team succeed, and for never running out of faith in me. Thank you, and I hope that, once I have graduated, my work will lift the team to places you could once have only dreamed about. Special thanks to Scott, my mentor, for staying with me and being a role model I can look up to.

    -Ezra (Head Programmer)
  • Design Requirements
    Must Have:

    Processed vision data from the Pixy
    Ability to locate reflective vision targets of assorted shapes
    Range detection either from the camera or ultrasonic sensor
    Move and turn the robot drive train to point at target
    Integration with I2C Bus and the Pixy
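    To illustrate the "move and turn the drive train to point at target" requirement above, here is a minimal sketch of turning a Pixy target block's x-coordinate into a proportional turn command. The class name, gain value, and frame size here are illustrative assumptions, not our competition code:

    ```java
    // Hypothetical sketch: convert a Pixy target block's pixel position
    // into a proportional drive-turn command. The Pixy1 processes frames
    // at roughly 320x200 px; the gain below would be tuned on the robot.
    public class PixyAiming {
        static final double FRAME_WIDTH_PX = 320.0; // Pixy1 horizontal resolution
        static final double KP_TURN = 0.004;        // proportional gain (illustrative)

        /** Signed pixel error of the target from the frame center. */
        public static double pixelError(double targetX) {
            return targetX - FRAME_WIDTH_PX / 2.0;
        }

        /** Motor turn command in [-1, 1], proportional to the pixel error. */
        public static double turnCommand(double targetX) {
            double cmd = KP_TURN * pixelError(targetX);
            return Math.max(-1.0, Math.min(1.0, cmd)); // clamp to motor range
        }
    }
    ```

    Feeding `turnCommand` into the drive train each loop iteration rotates the robot until the target sits at the center of the Pixy's frame, at which point the command goes to zero.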
  • Design Requirements (Turret)

    Build or modify a turret that can rotate and elevate
    Code that actuates the turret according to targeting data
    Mount for the Pixy camera that provides optimal ventilation and security
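    The turret requirement above (actuate according to targeting data) can be sketched as a simple proportional step: convert the target's pixel offset into a pan/tilt angle correction and clamp it to the turret's mechanical limits. The class, limits, and degrees-per-pixel figures below are assumptions for illustration (based on the Pixy1's approximate 75° x 47° field of view), not measured values from our turret:

    ```java
    // Hypothetical turret-actuation sketch: step the pan/tilt setpoints
    // toward the target using the Pixy's pixel error, clamped to
    // illustrative mechanical limits.
    public class TurretControl {
        static final double PAN_MIN_DEG = -90.0, PAN_MAX_DEG = 90.0;
        static final double TILT_MIN_DEG = 0.0,  TILT_MAX_DEG = 45.0;
        // Approximate Pixy1 field of view divided by frame resolution:
        static final double DEG_PER_PX_PAN  = 75.0 / 320.0;
        static final double DEG_PER_PX_TILT = 47.0 / 200.0;

        static double clamp(double v, double lo, double hi) {
            return Math.max(lo, Math.min(hi, v));
        }

        /** New pan setpoint after correcting for the horizontal pixel error. */
        public static double nextPan(double currentPanDeg, double errorPx) {
            return clamp(currentPanDeg + errorPx * DEG_PER_PX_PAN, PAN_MIN_DEG, PAN_MAX_DEG);
        }

        /** New tilt setpoint after correcting for the vertical pixel error. */
        public static double nextTilt(double currentTiltDeg, double errorPx) {
            return clamp(currentTiltDeg + errorPx * DEG_PER_PX_TILT, TILT_MIN_DEG, TILT_MAX_DEG);
        }
    }
    ```

    Running this each control loop walks the turret onto the target; the clamp keeps the code from commanding the mechanism past its hard stops.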
  • Extra Information

    When initially designing this auto-targeting system, we planned to use a LIDAR-Lite v1 to measure distance. The LIDAR is a powerful laser range finder, and something we will look to implement after some handy electrical work (if integration is successful, we will add all changes to the Git page for this source code). Unfortunately, given our six-week build time constraints, the LIDAR proved to be incompatible with the roboRIO itself, because the roboRIO's clock speed (4000 Hz) is much too fast for the LIDAR (1000 Hz). This was an unfortunate oversight by our department. For now, an ultrasonic range finder is sufficient and will be included in the documentation for this targeting / shooting system until further development of the LIDAR integration proves successful.
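    For reference, the ultrasonic range finder we fell back on works by timing a sound pulse's round trip. A minimal sketch of that conversion (the class name is hypothetical; the speed of sound assumes dry air at about 20 °C):

    ```java
    // Hypothetical sketch: convert an ultrasonic sensor's round-trip
    // echo time into a one-way distance to the target.
    public class UltrasonicRange {
        static final double SPEED_OF_SOUND_M_PER_S = 343.0; // dry air, ~20 C

        /** Round-trip echo time in seconds -> one-way distance in meters. */
        public static double distanceMeters(double echoSeconds) {
            // Sound travels to the target and back, so halve the trip.
            return echoSeconds * SPEED_OF_SOUND_M_PER_S / 2.0;
        }
    }
    ```

    A 10 ms echo therefore corresponds to a target roughly 1.7 m away, which is plenty of precision for shooter ranging at field distances.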
  • Source Code

    Coming soon with detailed instructions on its use.

    The Git page for our entire code base for this year can be found at our BitBucket overview of the source code. Just go down to the download section (a cloud icon) and download the repository. There you can look at the various classes we use with the Pixy / LIDAR. After the season, we will create a new repository that includes just the Pixy and LIDAR code, as well as a base project with the Pixy already implemented to build code off of.