Broader Impacts Discussion Session 2 // Camera Calibration
Today
- Session 2 of the Broader Impacts Discussions
- Camera Calibration (For Your Consideration)
- Studio Time
For Next Time
- Work on the Broader Impacts assignment Part 2, due on November 5th at 7PM
- Note – the last discussion session will happen on November 4th; you have been randomly assigned one of the discussion days to lead a discussion. You may swap slots with someone on a different day, but you must let an instructor know. Thanks!
- Work on your Machine Vision Project.
- In-class demos will be on Monday November 11th, and code/write-ups are due on Tuesday November 12th at 7PM.
- Note that prospective students will be joining us in class on the 11th!
- Consider whether there is feedback you’d like to share about the class
Broader Impacts Discussions
Today, some folks will be leading discussions on their broader impacts robots (discussion leaders are noted here).
Discussants: Please choose a table / area of the room (or outside of the room!) for your discussion, and let the instructors know if there are any materials you’d like to have available. A discussion slot is about 25 minutes total; you will be signaled at the 10-minute mark (when you should be transitioning from presentation to discussion) and again at the 5-minute warning.
Participants: At the end of the discussion slot, please fill in this reaction survey. Your responses will be available to the teaching team and to discussants afterwards.
We’ll have a brief debrief following the activity and before starting in on Studio time.
Camera Calibration
One of the essential practical aspects of machine vision is camera calibration: knowing the intrinsic and extrinsic parameters of your imaging system.
- Intrinsic Calibration refers to the sensor and lens characteristics of your imaging system; calibrating here allows you to correct for image distortions. This is a “projective transformation” between your camera coordinates and your image pixel coordinates.
- Extrinsic Calibration refers to the way in which your imaging system is set up. For instance, if you have two cameras, this would include their relative poses to one another. This is a “rigid transformation” between your world coordinates and your camera coordinates.
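To make the two transformations concrete, here is a minimal numpy sketch of projecting a world point into pixel coordinates: first the rigid (extrinsic) transform into camera coordinates, then the projective (intrinsic) transform into the image. All of the numbers (focal lengths, principal point, translation) are made-up example values, not from any real camera.

```python
import numpy as np

# Hypothetical intrinsic matrix K: focal lengths fx, fy (in pixels) and
# principal point (cx, cy). These are illustrative values only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical extrinsics: identity rotation and a translation that places
# the world origin 2 m in front of the camera.
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])

def project(point_world):
    """Apply the rigid (extrinsic) then the projective (intrinsic) transform."""
    p_cam = R @ point_world + t   # world -> camera coordinates (rigid)
    p_img = K @ p_cam             # camera -> image coordinates (projective)
    return p_img[:2] / p_img[2]   # perspective divide -> pixel coordinates

u, v = project(np.array([0.0, 0.0, 0.0]))
# The world origin lands at the principal point: (u, v) = (320, 240)
```

Note that the world origin projects exactly to the principal point here only because the rotation is the identity and the translation is purely along the optical axis.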
Pinhole Camera Model
The “pinhole camera model” is among the most commonly used models for basic camera calibration. These slides for today go through the key details. Please use this as a resource!
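The pinhole model's similar-triangles relationship also gives a quick way to estimate a camera's focal length (the trick the linked lecture demonstrates): an object of known height H at distance Z appears h pixels tall, so f = h * Z / H in pixel units. The numbers below are hypothetical example values.

```python
# Estimating focal length from the pinhole similar-triangles relation.
# All values are hypothetical example measurements.
H = 0.25   # real object height, in meters
Z = 1.0    # distance from the camera, in meters
h = 200.0  # measured height of the object in the image, in pixels

f = h * Z / H  # focal length in pixels
# f = 800.0
```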
Camera Calibration Resources
Here are several different tutorials on how to do (intrinsic) camera calibration in ROS2:
You can also check out the brief OpenCV demo if you’d like to just test this out on your webcam.
Note that you’ll need checkerboards for this! Some are available in the classroom. If you’d like to print your own, calib.io pattern generator is an excellent resource!
Additional Resources
- An article on What is the Center of an Image?
- YouTube lecture on the pinhole camera model (includes a fun demo on finding the focal length of your phone camera at around 7:30)
- MATLAB has a nice high-level explainer, with follow-on articles worth a look.