import streamlit as st
import plotly.graph_objects as go
import numpy as np
import sys
import os

# Add the root directory to the path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

# Import helper functions
from utils.helpers import display_asset_or_placeholder, display_iframe_or_link

# Set page configuration
st.set_page_config(
    page_title="Black Bee Drones | CV Journey",
    page_icon="assets/black_bee.png",
    layout="wide",
    initial_sidebar_state="expanded",
)
# Title and introduction
st.header("🐝 Black Bee Drones - Autonomous Missions")

st.markdown(
    """
    ### First Autonomous Drone Team in Latin America

    I joined the Black Bee Drones team in 2023 as a software member and remain active today. The team, based at UNIFEI,
    focuses on indoor and outdoor autonomous mission competitions, and we build our drones from the ground up.

    **Key Focus Areas:**
    - Autonomous indoor/outdoor missions
    - Custom drone building and integration
    - Software development for autonomous flight
    - Computer vision algorithms for navigation and object detection

    **Main Competition:** International Micro Air Vehicles (IMAV) Conference and Competition

    **Notable Achievement:** 3rd place in the IMAV 2023 Indoor competition, with a special award for being
    the only team to perform the flight completely autonomously.
    """
)
# Create columns for team structure
st.subheader("Team Structure")

col1, col2, col3, col4 = st.columns(4)

with col1:
    st.markdown("#### Hardware Team")
    st.markdown(
        """
        - Controller boards (PixHawk)
        - Onboard computers (Raspberry Pi 4)
        - Cameras (RaspCam, Oak-D)
        - Positioning systems (GPS, LiDAR)
        """
    )

with col2:
    st.markdown("#### Software Team")
    st.markdown(
        """
        - Computer Vision algorithms
        - Mapping & localization
        - Position estimation
        - Autonomous control
        """
    )

with col3:
    st.markdown("#### Mechanics Team")
    st.markdown(
        """
        - Frame design
        - 3D printing
        - Propulsion systems
        - Component arrangement
        """
    )

with col4:
    st.markdown("#### Management Team")
    st.markdown(
        """
        - Competition strategy
        - Documentation
        - Team organization
        - Resource allocation
        """
    )
st.markdown("---") | |
# Technologies section | |
st.subheader("Core Technologies & Concepts") | |
tech_tab1, tech_tab2, tech_tab3 = st.tabs( | |
["Software Stack", "CV Techniques", "Hardware Components"] | |
) | |
with tech_tab1: | |
col1, col2 = st.columns(2) | |
with col1: | |
st.markdown( | |
""" | |
#### Main Software Tools | |
- **OpenCV:** Image processing and computer vision | |
- **ROS (Robot Operating System):** Distributed computing for robotics | |
- **TensorFlow/PyTorch:** Deep learning frameworks | |
- **Docker:** Containerization for deployment | |
- **MAVLink/MAVROS:** Drone communication protocols | |
""" | |
) | |
with col2: | |
st.markdown( | |
""" | |
#### Programming Languages | |
- **Python:** Main language for CV and high-level control | |
- **C++:** Performance-critical components and ROS nodes | |
""" | |
) | |
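
# Illustrative sketch (not called by this page): opening a MAVLink link with
# pymavlink, the kind of connection the software stack above relies on. The
# UDP endpoint below is the common SITL default and is an assumption here,
# not the team's actual configuration.
def mavlink_heartbeat_sketch(endpoint="udp:127.0.0.1:14550"):
    from pymavlink import mavutil  # local import: optional dependency

    master = mavutil.mavlink_connection(endpoint)
    master.wait_heartbeat()  # block until the autopilot announces itself
    return master.target_system, master.target_component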
with tech_tab2:
    st.markdown(
        """
        #### Computer Vision & AI Techniques
        - **Basic Image Processing:** Filters, morphological operations, thresholding
        - **Feature Detection:** Corners, edges, and contours
        - **Marker Detection:** ArUco markers for localization
        - **Object Detection:** Custom models for mission-specific objects
        - **Line Following:** Color segmentation and path estimation
        - **Hand/Face Detection:** Using MediaPipe for gesture control
        - **Visual Odometry:** For position estimation in GPS-denied environments
        """
    )
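
# Illustrative sketch (not called by this page): minimal ArUco detection of
# the kind used for marker-based localization. The 4x4_50 dictionary is an
# assumption, and the detector API below targets OpenCV >= 4.7.
def detect_aruco_sketch(frame_bgr):
    import cv2  # local import: optional dependency

    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    return corners, ids  # detected marker corners feed the pose estimate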
with tech_tab3:
    col1, col2 = st.columns(2)
    with col1:
        st.markdown(
            """
            #### Control & Computing
            - **PixHawk:** Flight controller board
            - **Raspberry Pi 4:** Onboard computer
            - **ESCs & Motors:** Propulsion system
            - **Battery:** Power source
            """
        )
    with col2:
        st.markdown(
            """
            #### Sensors & Perception
            - **RaspCam/Oak-D:** Cameras for visual perception
            - **GPS:** Outdoor positioning (when available)
            - **LiDAR:** Distance sensing and mapping
            - **RealSense T265:** Visual-inertial odometry
            - **PX4 Flow:** Optical flow sensor for position holding
            """
        )

st.markdown("---")
# OpenCV Demo section
st.subheader("Demo: Real-time OpenCV Operations")

st.markdown(
    """
    Basic image processing is fundamental to drone perception. This demo showcases, in real time:
    - Various image filters and transformations
    - ArUco marker detection (used for drone localization)
    - Hand and face detection using MediaPipe
    """
)

display_iframe_or_link("https://samuellimabraz-opencv-gui.hf.space", height=800)
st.caption(
    "Link to Hugging Face Space: [OpenCV GUI Demo](https://samuellimabraz-opencv-gui.hf.space)"
)
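
# Illustrative sketch (not called by this page): the hand detection shown in
# the demo above follows this MediaPipe pattern; the hand count and confidence
# threshold are assumptions, not the demo's actual settings.
def detect_hands_sketch(frame_bgr):
    import cv2
    import mediapipe as mp  # local imports: optional dependencies

    hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(rgb)
    return results.multi_hand_landmarks  # None when no hand is detected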
st.markdown("---") | |
# Line Following Challenge | |
st.subheader("IMAV 2023 Indoor Mission: Line Following Challenge") | |
col1, col2 = st.columns(2) | |
with col1: | |
st.markdown( | |
""" | |
### The Challenge | |
The [IMAV 2023 Indoor](https://2023.imavs.org/index.php/indoor-competition/) Mission required drones to: | |
1. Navigate using ArUco markers for initial positioning | |
2. Follow a colored line on the floor to reach a deposit location | |
3. Deliver a block autonomously | |
This mission tested precise control, vision-based navigation, and autonomous decision-making. | |
""" | |
) | |
display_asset_or_placeholder( | |
"imav_mission_diagram.jpg", | |
caption="Diagram of the IMAV 2023 Indoor Mission", | |
use_column_width=True, | |
) | |
with col2:
    st.markdown(
        """
        ### Line Following Algorithm
        I developed a line-following algorithm consisting of:
        1. **Color Filtering:** Isolate the colored line using HSV thresholding in OpenCV
        2. **Line Detection & Orientation:** Estimate the line's position and direction
        3. **PID Control:** Adjust the drone's heading based on the line's position relative to the image center

        The algorithm proved robust to varying lighting conditions and line widths, which was crucial in the
        competition environment. A condensed sketch of these stages follows below.
        """
    )
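
# Illustrative sketch (not called by this page) of the three stages above:
# HSV thresholding, line position/orientation from the largest contour, and
# the two errors the PID controllers consume. The HSV bounds are placeholders,
# not the values tuned for the competition.
def line_errors_sketch(frame_bgr, hsv_lo=(20, 80, 80), hsv_hi=(35, 255, 255)):
    import cv2  # local import: optional dependency

    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no line in view
    line = max(contours, key=cv2.contourArea)
    m = cv2.moments(line)
    if m["m00"] == 0:
        return None  # degenerate contour
    center_error = m["m10"] / m["m00"] - frame_bgr.shape[1] / 2  # lateral offset (px)
    vx, vy, _, _ = cv2.fitLine(line, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    angle_error = np.arctan2(vx, vy)  # deviation from the vertical image axis
    return center_error, angle_error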
# Create a simple PID visualization
st.markdown("#### PID Control Visualization")

fig = go.Figure()

# Create data for the PID controller visualization
# Time points
t = np.linspace(0, 10, 100)

# Target (setpoint)
setpoint = np.zeros_like(t)

# PID response for roll (center line error)
center_error = np.sin(t) * np.exp(-0.3 * t)
roll_output = -center_error * 0.8

# PID response for yaw (angle error)
angle_error = np.cos(t) * np.exp(-0.4 * t)
yaw_output = -angle_error * 0.7
# Add traces
fig.add_trace(
    go.Scatter(
        x=t,
        y=setpoint,
        mode="lines",
        name="Setpoint",
        line=dict(color="green", width=2, dash="dash"),
    )
)
fig.add_trace(
    go.Scatter(
        x=t,
        y=center_error,
        mode="lines",
        name="Center Line Error",
        line=dict(color="red", width=2),
    )
)
fig.add_trace(
    go.Scatter(
        x=t,
        y=angle_error,
        mode="lines",
        name="Angle Error",
        line=dict(color="orange", width=2),
    )
)
fig.add_trace(
    go.Scatter(
        x=t,
        y=roll_output,
        mode="lines",
        name="Roll Correction",
        line=dict(color="blue", width=2),
    )
)
fig.add_trace(
    go.Scatter(
        x=t,
        y=yaw_output,
        mode="lines",
        name="Yaw Correction",
        line=dict(color="purple", width=2),
    )
)

# Update layout
fig.update_layout(
    title="PID Controllers for Line Following",
    xaxis_title="Time",
    yaxis_title="Error / Correction",
    legend=dict(y=0.99, x=0.01, orientation="h"),
    margin=dict(l=0, r=0, t=40, b=0),
    height=300,
)

st.plotly_chart(fig, use_container_width=True)
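
# Illustrative sketch (not called by this page): a minimal PID update of the
# kind the plot above depicts, one instance per axis (roll driven by the
# center-line error, yaw by the angle error). Gains are placeholders, not the
# competition tuning.
class PIDSketch:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        # Standard PID law: u = Kp*e + Ki*integral(e) + Kd*de/dt
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage (hypothetical gains): roll_pid = PIDSketch(0.8, 0.0, 0.1), then call
# roll_pid.update(center_error, dt) each frame to get the roll correction.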
# Demo iFrame
st.markdown("### Line Following Simulation Demo")

display_iframe_or_link("https://samuellimabraz-line-follow-pid.hf.space", height=1200)
st.caption(
    "Link to Hugging Face Space: [Line Follow PID Demo](https://samuellimabraz-line-follow-pid.hf.space)"
)

# Video demo
st.markdown("### Real Flight Footage (IMAV 2023)")

display_asset_or_placeholder(
    "drone_line_following_video.mp4",
    asset_type="video",
    caption="Black Bee Drone executing the line following task during IMAV 2023",
)

st.markdown("---")

st.markdown(
    """
    ### Team Recognition
    This work was made possible by the incredible Black Bee Drones team at UNIFEI. Special thanks to all members
    who contributed their expertise in hardware, software, mechanics, and management.
    """
)
st.markdown(
    "[Black Bee Drones](https://www.linkedin.com/company/blackbeedrones/posts/?feedView=all)"
)