[Udacity] Become a Sensor Fusion Engineer
Nanodegree Program (nd313)
Estimated Time
4 months at 10 hours/week
Get access to the classroom immediately on enrollment
Prerequisites
Intermediate C++, Calculus, and Probability
Why take this Nanodegree program?
The Sensor Fusion Engineer Nanodegree program will teach you the skills that most engineers learn on the job or in a graduate program: how to fuse data from multiple sensors to track non-linear motion and objects in the environment. Apply the skills you learn in this program to a career in robotics, self-driving cars, and much more.
What You Will Learn
Syllabus
Sensor Fusion Engineer
Learn to detect obstacles in lidar point clouds through clustering and segmentation, apply thresholds and filters to radar data in order to accurately track objects, and augment your perception by projecting camera images into three dimensions and fusing these projections with other sensor data. Combine this sensor data with Kalman filters to perceive the world around a vehicle and track objects over time.
Learn to fuse data from three of the primary sensors that robots use: lidar, camera, and radar.
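To give a flavor of the tracking material summarized above, here is a minimal one-dimensional Kalman filter sketch in C++. This is not code from the program itself: the state model, the noise variances, and the measurement values are illustrative assumptions, and the course projects work with multi-dimensional (including unscented) Kalman filters.

```cpp
#include <iostream>

// Minimal 1-D Kalman filter: track an object's position along one axis.
// State: position estimate x with variance p. The noise values (q, r) and
// the motion/measurement numbers below are illustrative assumptions only.
struct Kalman1D {
    double x = 0.0;   // state estimate (e.g., position in meters)
    double p = 1.0;   // estimate variance (uncertainty)
    double q = 0.01;  // process (motion) noise variance
    double r = 0.25;  // measurement noise variance

    // Predict step: propagate the state by a known motion u and grow uncertainty.
    void predict(double u) {
        x += u;
        p += q;
    }

    // Update step: fuse a new sensor measurement z using the Kalman gain.
    void update(double z) {
        const double k = p / (p + r);  // Kalman gain: how much to trust z
        x += k * (z - x);              // correct the estimate toward z
        p *= (1.0 - k);                // shrink uncertainty after the update
    }
};

int main() {
    Kalman1D kf;
    const double measurements[] = {1.0, 2.1, 2.9, 4.2, 5.0};  // e.g., noisy range readings
    for (double z : measurements) {
        kf.predict(1.0);  // assume the object moves roughly 1 m per time step
        kf.update(z);
        std::cout << "estimate: " << kf.x << "  variance: " << kf.p << '\n';
    }
    return 0;
}
```

Each loop iteration runs one predict/update cycle: the prediction grows the uncertainty, and fusing the measurement shrinks it again, which is the same pattern the program applies to lidar, radar, and camera data in higher dimensions.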
Estimated 4 Months
Prerequisite Knowledge
You should have intermediate C++ knowledge and be familiar with calculus, probability, and linear algebra.
All Our Programs Include
- Real-world projects from industry experts
- Technical mentor support
- Flexible learning program
- Student community
- Personal career coaching and career services:
  - Personalized feedback
  - Unlimited submissions and feedback loops
  - Practical tips and industry best practices
  - Additional suggested resources to improve
Learn with the best
David Silver
Head of Curriculum
David Silver leads the Udacity Curriculum Team. Before Udacity, David was a research engineer on the autonomous vehicle team at Ford. He has an MBA from Stanford and a BSE in Computer Science from Princeton.
Stephen Welch
Instructor
Stephen is a Content Developer at Udacity and has worked on the C++ and Self-Driving Car Engineer Nanodegree programs. He started teaching and coding while completing a Ph.D. in mathematics, and has been passionate about engineering education ever since.
Andreas Haja
Instructor
Andreas Haja is an engineer, educator, and autonomous vehicle enthusiast with a PhD in computer science. He now works as a professor, focusing on project-based learning in engineering. During his career with Volkswagen and Bosch, he developed camera technology and autonomous vehicle prototypes.
Abdullah Zaidi
Instructor
Abdullah holds an M.S. from the University of Maryland and is an expert in radio frequency design and digital signal processing. After spending several years at Qualcomm, he joined Metawave, where he now leads radar development for autonomous driving.
Aaron Brown
Instructor
Aaron Brown has a background in electrical engineering, robotics, and deep learning. He has worked as a Content Developer and Simulation Engineer at Udacity, focusing on developing projects for self-driving cars.
Program Details
Program Overview – Why should I take this program?
Why should I enroll?
Sensor fusion engineering is one of the most important and exciting areas of robotics.
Sensors like cameras, radar, and lidar help self-driving cars, drones, and all types of robots perceive their environment. Analyzing and fusing this data is fundamental to building an autonomous system.
In this Nanodegree program, you will work with camera images, radar signatures, and lidar point clouds to detect and track vehicles and pedestrians. By graduation, you will have an impressive portfolio of projects to demonstrate your skills to employers.
Size: 2.54 GB
https://www.udacity.com/course/sensor-fusion-engineer-nanodegree--nd313