Project GlyphMotion
Comprehensive Documentation
This project provides a web-based interface for real-time object tracking in videos. Users can upload a video file or provide a video URL, which is then processed by a local backend system. The system leverages the Ultralytics YOLOv8(m) model for object detection and tracking and integrates with GitHub Pages and Google Drive for storing and serving processed results.
Use Cases
- Video Production & Post-Production – Automates object tracking, significantly reducing manual effort.
- Security & Surveillance – Detects and tracks individuals or vehicles for anomaly detection and monitoring.
- Sports Analytics – Analyzes player movements, ball trajectories, and team strategies.
- Traffic Monitoring – Tracks vehicles to gather insights on traffic flow and incident detection.
- Retail Analytics – Monitors customer movement to optimize layouts and product placements.
- Robotics & Autonomous Systems – Provides real-time tracking for robotic navigation.
- Wildlife Monitoring – Tracks animal movements for ecological studies.
Features
- Video Upload & URL Input – Supports both local uploads and direct URLs.
- YouTube Video Support – Videos can be submitted directly via YouTube URLs, which are downloaded through `yt-dlp` integration.
- Real-time Object Tracking – Uses YOLOv8(m) for efficient detection.
- Live Status Updates – Provides real-time feedback on processing.
- Processed Video Gallery – Hosts previously processed videos with links and downloads.
- Responsive Design – Works across desktop, tablet, and mobile devices.
- Progressive Web App (PWA) Support – Installable on desktop and mobile for an enhanced, app-like experience.
- Dynamic Layout Options – Toggle between 2-video or 3-video per row layouts.
- View Mode Toggle – Switch between compact and expanded views.
- Persistent Settings – Saves user preferences locally.
- GitHub Pages & Google Drive Integration – Automates metadata commits and manages storage.
- Telegram Bot Access – Allows interaction with the backend via Telegram.
- Admin Mode – Securely log in as an administrator to access advanced features like video deletion directly from the gallery.
- Secure Admin Authentication – Backend admin verification utilizes `bcrypt` hashing for robust password security.
- Admin Tracking Dashboard – A dedicated panel (`admin_tracker.html`) to view comprehensive details of all processed videos, including GitHub commit SHAs, client IP addresses, and geolocation data (country, city, ISP) of the original request.
- Admin Password Management – A script (`admin_hash_gen.py`) for securely adding new admin users or updating existing admin passwords.
- Master Admin Control – Master administrators can invalidate all active admin sessions with a single click from the website.
- Frame Count Restriction & Timeout – To ensure stable performance and prevent excessive processing times for very large videos, the backend includes a configurable frame count limit. Videos exceeding this limit are rejected immediately after download. A timeout is also applied during frame count determination to prevent indefinite hangs. These settings are fully configurable in `tg.py`.
- Centralized Message Display – The frontend uses a central, temporary message overlay to provide clear feedback for actions like video rejection, installation prompts, or other important system messages. The display duration of these messages is configurable.
Project Structure
.
├── index.html # Main web interface (frontend)
├── admin.html # Admin login page
├── admin_tracker.html # Admin dashboard for tracking processed videos and request origins
├── manifest.json # PWA: Web App Manifest for installability
├── service-worker.js # PWA: Service Worker for offline capabilities and caching
├── images/
│   ├── project-glyph-motion.ico          # Favicon for the website
│   ├── project-glyph-motion.png          # Project logo
│   ├── project-glyph-motion-192x192.png  # PWA icon (192x192)
│   ├── project-glyph-motion-512x512.png  # PWA icon (512x512)
│   ├── project-glyph-motion-maskable.png # PWA maskable icon
│   └── thumbnail_fallback.jpg            # Fallback thumbnail for videos
├── input/ # Directory for uploaded video files (processed by the backend)
├── output/ # Directory for processed video files (generated by the backend)
├── yolov8m.pt               # Pre-trained Ultralytics YOLOv8(m) model file (used for object tracking)
├── tg.py (telegram.py) # Backend orchestrator: handles web requests, triggers `ot.py`, integrates GitHub/Drive, and powers the Telegram bot.
├── ot.py (object_tracker.py) # Core video processor: performs object tracking with YOLOv8(m).
├── gh.py (github.py) # Manages GitHub commits and Google Drive integration.
├── client_secret.json # Google Drive API credentials (for authenticated access)
├── config.yml # Configuration file for the backend (Paths and endpoints, etc.) [See the actual project config.yml for details]
├── admin_auth.py # Handles secure admin authentication (stores bcrypt hashes)
├── admin_hash_gen.py # Script to generate and update admin credentials directly in admin_auth.py
├── documentation.html # This documentation file
├── requirements.txt # Lists all Python dependencies
├── web_server_public.py # Python script to run the Flask web server and Cloudflare Tunnel (Automation of the whole backend)
├── token.json # Google Drive API token file (for authenticated access)
├── videos.json # JSON file containing metadata for processed videos (links, titles, etc.)
├── tracking_data.json # JSON file containing tracking data for processed videos (IP Address, Lat/Lon and object IDs, timestamps, etc.)
├── .gitignore # Specifies files and directories to ignore in Git
├── README.md # Project overview and setup instructions
└── LICENSE                  # Project license file
IMPORTANT WARNING: DO NOT MODIFY ANY CODE UNLESS EXPLICITLY INSTRUCTED. INCORRECT MODIFICATIONS CAN LEAD TO PROJECT FAILURE.
IMPORTANT NOTE ON FILE MANAGEMENT:
Only `index.html`, `videos.json`, and the PWA files (`manifest.json`, `service-worker.js`) strictly need to be uploaded to your GitHub repository and maintained there for the frontend.
All other backend-related files (Python scripts, credentials, `tracking_data.json`, `config.yml`, etc.) should be kept *only* on the local machine where the backend server is running. This ensures that no sensitive code or data is inadvertently pushed to GitHub, regardless of whether your repository is public or private.
How to Get a Free Custom Domain:
If you're a student, you can often get a free custom domain through the GitHub Education Pack. Popular registrars like Namecheap or Name.com offer free domains as part of this pack. You will typically need to verify your student status using your school/college ID card or academic email. Simply apply for the GitHub Education Pack, and once approved, you can claim your free domain.
Frontend Setup (Website)
The frontend consists of `index.html`, `admin.html`, and `admin_tracker.html`, which use Tailwind CSS for styling and JavaScript for interactivity. It's recommended to set up the frontend first to understand the website's structure and then configure the backend.
Before proceeding with the website setup, it is crucial to read the "Security and Data Privacy (Admin Tracker)" section at the very end of this document to understand the implications of data collection and your responsibilities as a project owner.
1. Clone the Repository:
git clone <your-repository-url>
cd <your-repository-name>
2. Favicon Path:
Ensure the favicon path is correct in `index.html`, `admin.html`, and `admin_tracker.html`:
<link rel="icon" type="image/svg+xml" href="images/project-glyph-motion.ico">
Make sure the `images` folder and the `project-glyph-motion.ico` file exist in your project's root directory relative to the HTML files.
3. Configure Frontend JavaScript Variables in `index.html`, `admin.html`, and `admin_tracker.html`:
Open these HTML files and locate the `<script>` section. You'll find important configuration variables. These URLs are examples; ensure they match your actual setup.
For `index.html`:
// IMPORTANT: This should now be your custom domain for GitHub Pages
const GITHUB_PAGES_REPO_URL = 'https://projectglyphmotion.studio/'; // Example: Replace with your GitHub Pages URL
const VIDEOS_JSON_PATH = 'videos.json';
const POLLING_INTERVAL_VIDEOS = 10000; // Poll videos.json every 10 seconds
const POLLING_INTERVAL_STATUS = 2000; // Poll live status every 2 seconds
const POLLING_INTERVAL_SERVER_HEALTH = 5000; // Poll server health every 5 seconds (new)
// IMPORTANT: This should be your custom domain for the backend tunnel
const NGROK_PUBLIC_URL = 'https://vot.onehorizon.me'; // Example: Replace with your Cloudflare Tunnel custom domain
const LOCAL_API_ENDPOINT = NGROK_PUBLIC_URL + '/process_web_video';
const LOCAL_STATUS_ENDPOINT = NGROK_PUBLIC_URL + '/status';
// ... other endpoints like LOCAL_DELETE_ENDPOINT, LOCAL_COMMIT_INFO_ENDPOINT, LOCAL_LOGOUT_ENDPOINT, etc.
// Configuration for how long central messages are displayed (in milliseconds)
const CENTRAL_MESSAGE_DISPLAY_DURATION_MS = 5000; // 5 seconds (configurable)
For `admin.html` and `admin_tracker.html`:
// IMPORTANT: This should be your custom domain for the backend tunnel
const NGROK_PUBLIC_URL = 'https://vot.onehorizon.me'; // Example: Replace with your Cloudflare Tunnel custom domain
const LOCAL_LOGIN_ENDPOINT = NGROK_PUBLIC_URL + '/login'; // For admin.html
const LOCAL_TRACKER_DATA_ENDPOINT = NGROK_PUBLIC_URL + '/admin_tracker_data'; // For admin_tracker.html
// ... other relevant endpoints
- `GITHUB_PAGES_REPO_URL`: This URL specifies where your `videos.json` file (containing links to processed videos) is hosted. The value `https://projectglyphmotion.studio/` is an example. After setting up your GitHub Pages, you MUST replace this with the base URL of your GitHub Pages site. For example, if your repository is `your-username/my-video-tracker` and it's published to GitHub Pages, the URL might be `https://your-username.github.io/my-video-tracker/`. Ensure it ends with a trailing slash `/`.
- `NGROK_PUBLIC_URL`: Although named `NGROK_PUBLIC_URL`, this variable is used for your Cloudflare Tunnel domain. The value `https://vot.onehorizon.me` is an example. The `web_server_public.py` script will automatically update this value to your `CLOUDFLARE_CUSTOM_DOMAIN` when you run it. `LOCAL_API_ENDPOINT` and `LOCAL_STATUS_ENDPOINT` are derived from this.
- The `POLLING_INTERVAL` variables control how frequently the frontend checks for updates from the backend and GitHub Pages.
- `CENTRAL_MESSAGE_DISPLAY_DURATION_MS`: This configurable value in `index.html` determines how long the central, temporary messages (e.g., video rejection, installation prompts) are displayed on the screen, in milliseconds. The default is 5000 ms (5 seconds).
- Master Admin UI Visibility: For the "Logout All Admins" button and other master admin options to be visible on the website, you must set the `MASTER_ADMIN_USERNAMES_FRONTEND` array in `index.html` to include the exact usernames of your master administrators. This is a frontend UI setting only.
  Example (in `index.html`): `const MASTER_ADMIN_USERNAMES_FRONTEND = ["your_master_admin_username_1", "your_master_admin_username_2"];`
4. Progressive Web App (PWA) Integration:
Project GlyphMotion supports Progressive Web App (PWA) features, allowing users to install the application to their home screen on both desktop and mobile devices for an enhanced, app-like experience. This is achieved through `manifest.json` and `service-worker.js`.
- `manifest.json`: This file provides metadata about your web application, including its name, icons, start URL, display mode, and theme colors. Browsers use this information to present your PWA as a native-like application. Ensure this file is correctly configured for your project's branding and desired behavior.
- `service-worker.js`: This JavaScript file runs in the background, independent of the main browser thread. It enables powerful features like offline support, caching of assets, and push notifications. The service worker in this project is configured to cache essential application files, ensuring a faster loading experience on subsequent visits and basic offline functionality.
No direct configuration is usually needed for `manifest.json` or `service-worker.js` unless you want to customize the PWA's appearance or caching strategy. Ensure these files are present in your project's root directory.
5. Open the Website (Local Preview):
You can simply open the `index.html` file in your web browser to see the frontend. The backend functionality will not work yet, but you can preview the UI.
Frontend Static Pages Serving (GitHub Pages DNS Configuration)
To ensure that the Project GlyphMotion frontend (e.g., `https://projectglyphmotion.studio`) loads reliably and independently of your local backend's tunnel status, we configure it to serve static pages directly from GitHub Pages. This involves setting up DNS records in Cloudflare to point your root domain to GitHub's servers. This guarantees your HTML, CSS, JavaScript, and `videos.json` files load consistently from GitHub's global content delivery network.
Steps to Configure GitHub Pages for Direct Domain Serving:
- Verify GitHub Pages Setup:
  Ensure your GitHub repository has GitHub Pages enabled. Go to your repository settings on GitHub, navigate to "Pages," and select the branch from which to publish (usually `main` or `master`). If you're using a custom domain, ensure it's specified there.
- Add Custom Domain (if applicable):
  If you wish to use a custom domain (e.g., `projectglyphmotion.studio` instead of `your-username.github.io/your-repo/`), add it in your GitHub repository's "Pages" settings under "Custom domain."
- Configure DNS Records in Cloudflare:
  Assuming you are using Cloudflare for DNS management (highly recommended for performance and security), you will need to add specific DNS records. These records instruct browsers to connect directly to GitHub's infrastructure without passing through the Cloudflare Tunnel. For our example domain `projectglyphmotion.studio`, we configured the following:
  For IPv4 Traffic (A Records):
  You will add multiple `A` records pointing to GitHub Pages' IPv4 addresses. For `projectglyphmotion.studio`, these were:
  - Type: `A`, Name: `@` (or your domain/subdomain), IPv4 address: `185.199.108.153`
  - Type: `A`, Name: `@`, IPv4 address: `185.199.109.153`
  - Type: `A`, Name: `@`, IPv4 address: `185.199.110.153`
  - Type: `A`, Name: `@`, IPv4 address: `185.199.111.153`
  - Type: `A`, Name: `@`, IPv4 address: `192.30.252.153`
  - Type: `A`, Name: `@`, IPv4 address: `192.30.252.154`
  For IPv6 Traffic (AAAA Records):
  Similarly, add `AAAA` records for IPv6 support:
  - Type: `AAAA`, Name: `@`, IPv6 address: `2606:50c0:8000::153`
  - Type: `AAAA`, Name: `@`, IPv6 address: `2606:50c0:8001::153`
  - Type: `AAAA`, Name: `@`, IPv6 address: `2606:50c0:8002::153`
  - Type: `AAAA`, Name: `@`, IPv6 address: `2606:50c0:8003::153`
  IMPORTANT: For all these records (A and AAAA), ensure the "Proxy status" in Cloudflare is set to "DNS only" (grey cloud icon). This is critical to direct traffic straight to GitHub Pages instead of routing it through Cloudflare's proxy. If you see an orange cloud icon, click it to change it to grey. Note that these IP addresses are universal for GitHub Pages hosting and apply to any GitHub Pages project, whether hosted on a personal account or an organization account. They are not exclusive to a single project or user.
- DNS Propagation:
  After saving these DNS records, it may take some time for the changes to propagate across the internet. You can use tools like whatsmydns.net to check the propagation status for your domain and the specified IP addresses.
- Update GitHub Pages Custom Domain Settings:
  Once the DNS records have propagated, go back to your GitHub repository's "Pages" settings. Under "Custom domain," enter your root domain (e.g., `projectglyphmotion.studio`) and save. GitHub will verify the DNS records. If successful, your site will now be served directly from your custom domain.
This static serving setup ensures that your website's interface is always available to users, providing a consistent user experience regardless of whether your local backend is online. Any dynamic functionalities (like starting a new video analysis or accessing admin data) are handled by separate API calls to your backend subdomain (e.g., `backend.projectglyphmotion.studio` or `vot.onehorizon.me`), which still relies on the Cloudflare Tunnel. While these IP addresses are currently the official ones for GitHub Pages and are expected to be stable for a long time, it's always a good practice to periodically check GitHub's official documentation if you encounter unexpected loading issues, as these IPs can be updated by GitHub.
Backend Setup
The backend system consists of `ot.py` (object_tracker.py), `tg.py` (telegram.py), `gh.py` (github.py), `admin_auth.py`, and `admin_hash_gen.py`.
1. Install Python Dependencies:
You can install all necessary Python dependencies by running the following command:
pip install -r requirements.txt
The `requirements.txt` file is located in the project's root directory and lists all required libraries, including:
- `ultralytics`: The core library for YOLOv8(m) object tracking (used by `ot.py`).
- `Flask`: A micro web framework for building the backend API (used by `tg.py`).
- `Flask-Cors`: For handling Cross-Origin Resource Sharing.
- `python-dotenv`: To load environment variables (if you choose to use a `.env` file for secrets).
- `python-telegram-bot`: For building the Telegram bot functionality (used by `tg.py`).
- `GitPython`: For programmatically interacting with Git repositories (used by `gh.py`).
- `google-api-python-client`, `google-auth-httplib2`, `google-auth-oauthlib`: For Google Drive API integration (used by `gh.py`).
- `psutil`: For monitoring system resource usage (used by `ot.py`).
- `bcrypt`: For secure password hashing (used by `admin_auth.py` and `admin_hash_gen.py`).
- `pyjwt`: For JSON Web Token (JWT) handling (used by `tg.py` for session management).
- `yt-dlp`: For downloading videos from various web sources, including YouTube.
2. Configure `tg.py` (telegram.py):
Open `tg.py` and update the following configuration variables. These settings are fully configurable to suit your needs:
USE_GITHUB_PAGES = True # Set to True to enable GitHub Pages integration (commits to repo, updates videos.json)
# Frame Restriction Configuration
FRAME_RESTRICTION_ENABLED = True # Set to True to enable frame count restriction
FRAME_RESTRICTION_VALUE = 7000 # Max allowed frames for video processing
FFPROBE_TIMEOUT_SECONDS = 30 # Timeout for ffprobe command in seconds
- `USE_GITHUB_PAGES`: This switch enables or disables the integration with GitHub Pages and Google Drive. Set to `True` to use it, `False` otherwise.
- `FRAME_RESTRICTION_ENABLED`: Set to `True` to enable frame count checking for incoming videos. This helps prevent processing excessively long videos that might cause resource issues or timeouts. If `False`, no frame check will be performed.
- `FRAME_RESTRICTION_VALUE`: If `FRAME_RESTRICTION_ENABLED` is `True`, this value defines the maximum number of frames a video can have to be accepted for processing. Videos exceeding this will be rejected with a user-friendly message.
- `FFPROBE_TIMEOUT_SECONDS`: This setting defines the maximum time (in seconds) allowed for `ffprobe` to determine the video's frame count. If `ffprobe` doesn't respond within this timeout, it will be killed, and the video will be rejected, preventing indefinite hangs on problematic files.
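For illustration, the following sketch shows how such a frame-count guard can be implemented with `ffprobe` and a hard timeout. The helper name and the exact `ffprobe` flags are assumptions for this example, not necessarily what `tg.py` uses internally:

import subprocess

FRAME_RESTRICTION_ENABLED = True
FRAME_RESTRICTION_VALUE = 7000
FFPROBE_TIMEOUT_SECONDS = 30

def video_within_frame_limit(path: str) -> bool:
    # Hypothetical helper: count packets on the first video stream and
    # enforce the configured frame limit, rejecting on timeout or errors.
    if not FRAME_RESTRICTION_ENABLED:
        return True
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-count_packets",
        "-show_entries", "stream=nb_read_packets",
        "-of", "csv=p=0",
        path,
    ]
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True,
            timeout=FFPROBE_TIMEOUT_SECONDS, check=True,
        )
        frame_count = int(result.stdout.strip())
    except (subprocess.TimeoutExpired, subprocess.CalledProcessError, ValueError):
        # A hung or failed ffprobe counts as a rejection rather than a hang.
        return False
    return frame_count <= FRAME_RESTRICTION_VALUE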
The Telegram bot functionality is pre-configured within `tg.py`. You just need to provide your bot's unique token:
TELEGRAM_BOT_TOKEN = "YOUR_TELEGRAM_BOT_TOKEN_HERE" # Replace with your bot token
You can obtain a bot token from BotFather on Telegram.
For admin functionality, ensure your `JWT_SECRET_KEY` is strong and random. Also, configure `MASTER_ADMIN_USERNAMES` if you want specific admins to have the ability to log out all other active sessions:
# JWT Secret Key (VERY IMPORTANT: Replace with a strong, random key in production!)
JWT_SECRET_KEY = 'YOUR_VERY_SECRET_JWT_KEY_HERE' # e.g., 'f9a8b7c6d5e4f3a2b1c0d9e8f7a6b5c4'
# Master Admin Usernames (only these users can trigger global logout and other master actions)
# These users must also exist in ADMIN_CREDENTIALS in admin_auth.py
MASTER_ADMIN_USERNAMES = {"your_admin_username_1", "your_admin_username_2"} # Add more usernames to this set
You can generate a cryptographically secure key by opening a Python terminal and running:
import secrets
secrets.token_hex(32)
Copy the output and paste it as the value for your `JWT_SECRET_KEY`.
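To see how this key is used, the snippet below sketches issuing and verifying a session token with `pyjwt`. The claim names and expiry here are assumptions for this example; the actual session handling lives in `tg.py`:

import datetime
import jwt  # provided by the pyjwt package

JWT_SECRET_KEY = "YOUR_VERY_SECRET_JWT_KEY_HERE"

def issue_admin_token(username: str, days_valid: int = 7) -> str:
    # Sign a token that identifies the admin and expires automatically.
    payload = {
        "sub": username,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(days=days_valid),
    }
    return jwt.encode(payload, JWT_SECRET_KEY, algorithm="HS256")

def verify_admin_token(token: str):
    # Return the username for a valid, unexpired token, or None otherwise.
    try:
        return jwt.decode(token, JWT_SECRET_KEY, algorithms=["HS256"])["sub"]
    except jwt.InvalidTokenError:  # also covers expired signatures
        return None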
3. Configure `admin_auth.py` and `admin_hash_gen.py`:
The `admin_auth.py` file holds the bcrypt hashes of your admin credentials. DO NOT manually edit the hashes here unless you understand the bcrypt format. Instead, use the `admin_hash_gen.py` script to manage admin users and their passwords securely.
Run `admin_hash_gen.py` to add your initial admin users:
python admin_hash_gen.py
Follow the prompts to create or update admin usernames and passwords. This script will automatically update `admin_auth.py` with the new bcrypt hashes.
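Under the hood, this relies on the `bcrypt` library. A minimal sketch of the hash-and-verify pattern is shown below; the actual helper names in `admin_auth.py` and `admin_hash_gen.py` may differ:

import bcrypt

def hash_password(plain_password: str) -> str:
    # Hash with a per-password random salt (the salt is embedded in the hash).
    return bcrypt.hashpw(plain_password.encode("utf-8"), bcrypt.gensalt()).decode("utf-8")

def check_password(plain_password: str, stored_hash: str) -> bool:
    # Verify a login attempt against the stored bcrypt hash.
    return bcrypt.checkpw(plain_password.encode("utf-8"), stored_hash.encode("utf-8"))

# Conceptually, admin_hash_gen.py stores entries like this in admin_auth.py:
ADMIN_CREDENTIALS = {
    "your_admin_username_1": hash_password("choose-a-strong-password"),
}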
You can also configure the session timeout for admin logins in `admin_auth.py`:
SESSION_TIMEOUT_ENABLED = True # Set to True or False
SESSION_DURATION_DAYS = 7 # Example: 7 days
4. Configure `gh.py` (github.py):
Open `gh.py` and update the following configuration variables. It is highly recommended to use environment variables for sensitive information like `GITHUB_ACCESS_TOKEN` in a production environment.
# GitHub API
_HARDCODED_GITHUB_ACCESS_TOKEN = "YOUR_GITHUB_PERSONAL_ACCESS_TOKEN_HERE" # <--- IMPORTANT: Replace with your actual GitHub PAT
GITHUB_ACCESS_TOKEN = _HARDCODED_GITHUB_ACCESS_TOKEN if _HARDCODED_GITHUB_ACCESS_TOKEN != "YOUR_GITHUB_PERSONAL_ACCESS_TOKEN_HERE" else os.getenv('GITHUB_ACCESS_TOKEN')
_HARDCODED_GITHUB_USERNAME = "YOUR_GITHUB_USERNAME_HERE" # <--- Replace with your GitHub username
GITHUB_USERNAME = _HARDCODED_GITHUB_USERNAME if _HARDCODED_GITHUB_USERNAME != "YOUR_GITHUB_USERNAME_HERE" else os.getenv('GITHUB_USERNAME')
_HARDCODED_GITHUB_REPO_NAME = "YOUR_GITHUB_REPO_NAME_HERE" # <--- Replace with your repository name
GITHUB_REPO_NAME = _HARDCODED_GITHUB_REPO_NAME if _HARDCODED_GITHUB_REPO_NAME != "YOUR_GITHUB_REPO_NAME_HERE" else os.getenv('GITHUB_REPO_NAME')
# Branch to commit to (usually 'main' or 'master' for GitHub Pages)
GITHUB_BRANCH = 'main'
# Path to the JSON file in your GitHub repository that stores video metadata
GITHUB_VIDEOS_JSON_PATH = 'videos.json'
# Google Drive API
# Ensure client_secret.json is in the same directory or provide its path
GOOGLE_DRIVE_CLIENT_SECRET_FILE = 'client_secret.json'
GOOGLE_DRIVE_TOKEN_FILE = 'token.json' # This file will be created after first authentication
# Name of the parent folder in Google Drive for processed videos (e.g., 'ObjectTrackerMaster/output')
GOOGLE_DRIVE_OUTPUT_FOLDER_NAME = 'ObjectTrackerMaster/output'
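For reference, `gh.py` uses GitPython (listed in `requirements.txt`) for the commit step. A minimal sketch of that pattern looks like the following; the function name and commit message are illustrative only:

from git import Repo  # provided by the GitPython package

def commit_videos_json(repo_path: str, branch: str = "main",
                       message: str = "Update videos.json with new processed video") -> str:
    # Stage videos.json, commit it, push to origin, and return the commit SHA.
    repo = Repo(repo_path)
    repo.git.checkout(branch)
    repo.index.add(["videos.json"])
    commit = repo.index.commit(message)
    repo.remote(name="origin").push(branch)
    return commit.hexsha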
- Google Drive API Setup (Crucial for Public Access):
  To enable `gh.py` to upload videos to your Google Drive and make them publicly accessible for embedding on your GitHub Pages site, you need to set up Google Drive API credentials.
  - Create a Project in Google Cloud Console: Go to the Google Cloud Console and create a new project.
  - Enable Google Drive API: In your new project, navigate to "APIs & Services" > "Enabled APIs & Services". Search for "Google Drive API" and enable it.
  - Create OAuth Consent Screen: Go to "APIs & Services" > "OAuth consent screen".
    - Choose "External" user type.
    - Fill in the required app information (app name, user support email, developer contact information).
    - Add the scope `.../auth/drive.file`.
    - Set Publishing Status to "Production": This is critical! If your app remains in "Testing" status, only test users you explicitly add will be able to access the service. For anyone (including yourself on different machines or users of your public website) to be able to use the Google Drive integration, the OAuth consent screen MUST be in "Production" status.
  - Create Credentials: Go to "APIs & Services" > "Credentials".
    - Click "CREATE CREDENTIALS" and choose "OAuth client ID".
    - Select "Desktop app" as the application type.
    - Give it a name and click "Create".
    - You will be presented with your client ID and client secret. Download the `client_secret.json` file. This file often has a long name (e.g., `client_secret_xxxxxxxxxxxx.apps.googleusercontent.com.json`). Please rename it to `client_secret.json` and place it in the same directory as your `gh.py` script.
  - First-time Authentication: The first time `gh.py` runs and attempts to upload a video, it will open a browser window asking you to authenticate with your Google account. This will generate a `token.json` file in the same directory, which stores your credentials for future use.
  - `token.json` and `client_secret.json`: These files contain sensitive authentication credentials that allow programmatic access to your Google Drive. DO NOT commit `token.json` or `client_secret.json` to your Git repository, even if it's private. If these files are ever exposed, it could compromise your Google Drive account. These files are generated locally and should remain local to the machine running the backend. If you need to transfer your setup to another machine, transfer them securely (e.g., via a secure copy protocol), not through Git.
    To prevent accidental commits, it is highly recommended to add `client_secret.json` and `token.json` to your project's `.gitignore` file:
    # .gitignore
    client_secret.json
    token.json
  - Your Account Space: When `gh.py` uploads processed videos, they will be stored in your Google Drive account, consuming your available storage space. Keep this in mind, especially for large or numerous video files.
  - Folder Structure: `gh.py` is configured to create a folder named `ObjectTrackerMaster` in your Google Drive, and inside it, a subfolder named `output`. All processed video files will be saved within this `ObjectTrackerMaster/output` folder. (A minimal sketch of the upload-and-share flow follows this list.)
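As noted above, here is a minimal, hedged sketch of the upload-and-share flow using the Google Drive API client libraries from `requirements.txt`. Folder lookup, retries, and error handling in the real `gh.py` are more involved, and the helper name is hypothetical:

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

SCOPES = ["https://www.googleapis.com/auth/drive.file"]

def upload_and_share(video_path: str, folder_id: str) -> str:
    # Upload a processed video, make it publicly readable, and return its file ID.
    creds = Credentials.from_authorized_user_file("token.json", SCOPES)
    service = build("drive", "v3", credentials=creds)

    metadata = {"name": video_path.split("/")[-1], "parents": [folder_id]}
    media = MediaFileUpload(video_path, mimetype="video/mp4", resumable=True)
    uploaded = service.files().create(body=metadata, media_body=media, fields="id").execute()
    file_id = uploaded["id"]

    # Grant read-only access to anyone with the link so the video can be embedded.
    service.permissions().create(
        fileId=file_id, body={"type": "anyone", "role": "reader"}
    ).execute()
    return file_id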
5. Cloudflare Tunnel Setup:
- Install the `cloudflared` CLI: Follow the instructions for your operating system in the Cloudflare Tunnel documentation.
- Authenticate `cloudflared`:
  cloudflared tunnel login
  This will open a browser window to authenticate with your Cloudflare account.
- Create a Tunnel (if you haven't already):
  When creating your tunnel, use the name `project-glyph-motion-tunnel`:
  cloudflared tunnel create project-glyph-motion-tunnel
  This will generate a tunnel ID and a credential file.
- Configure DNS for your Custom Domain:
  Before running the tunnel, you need to configure your custom domain to point to Cloudflare's nameservers. This step is crucial for Cloudflare to manage traffic for your domain.
  - Get Cloudflare Nameservers: After adding your domain to Cloudflare, Cloudflare will provide you with two nameservers (e.g., `john.ns.cloudflare.com` and `mary.ns.cloudflare.com`).
  - Update Nameservers at your Domain Registrar: Go to your domain registrar's website (e.g., Namecheap, Name.com). Find the DNS management or nameserver settings for your domain.
  - Replace Existing Nameservers: Change the existing nameservers to the ones provided by Cloudflare. Save the changes.
  - Add CNAME Record in Cloudflare DNS: In your Cloudflare DNS settings, add a CNAME record for your chosen subdomain (e.g., `vot`). The `Target` for this CNAME record will be your tunnel's UUID followed by `.cfargotunnel.com`.
    - Type: `CNAME`
    - Name: `your_subdomain` (e.g., `vot`)
    - Target: `<YOUR_TUNNEL_UUID_HERE>.cfargotunnel.com` (e.g., `a1b2c3d4-e5f6-7890-1234-567890abcdef.cfargotunnel.com`)
    - Proxy Status: Ensure it's set to "Proxied" (orange cloud icon).
  - Propagation: DNS changes can take up to 10 minutes to propagate globally, though it's often faster. You can actively check the propagation status using whatsmydns.net. Also, remember to check the "Start Propagation Check" option in your Cloudflare settings. Your domain won't work with Cloudflare Tunnel until this propagation is complete.
- Create a `config.yml` file: Save this `config.yml` file inside your project folder (`Project-Glyph-Motion`). This file tells Cloudflare Tunnel where to route traffic. Note: The tunnel configuration (`cloudflared tunnel create` and `config.yml` setup) is a one-time process.

  # config.yml
  # Save this file inside your Project-Glyph-Motion project folder.

  # The ID of your Cloudflare Tunnel.
  # You will get this UUID when you run 'cloudflared tunnel create project-glyph-motion-tunnel'.
  tunnel: <YOUR_TUNNEL_ID_HERE> # Example: a1b2c3d4-e5f6-7890-1234-567890abcdef

  # The path to your Cloudflare Tunnel credentials file.
  # This was generated when you ran 'cloudflared tunnel create project-glyph-motion-tunnel'.
  # Example: /home/user/.cloudflared/a1b2c3d4-e5f6-7890-1234-567890abcdef.json
  credentials-file: /path/to/your/.cloudflared/<YOUR_TUNNEL_ID_HERE>.json

  # Ingress rules define how traffic is routed through your tunnel.
  ingress:
    # This rule routes traffic from your custom domain to your local tg.py server.
    # IMPORTANT: Replace 'your-custom-domain.com' with the actual custom domain
    # (e.g., vot.onehorizon.me) that you will register and add to Cloudflare.
    - hostname: your-custom-domain.com
      service: http://localhost:5000 # Your Flask app's local address (tg.py's port)
    # This rule is a fallback. If no other ingress rule matches, Cloudflare will return a 404.
    - service: http_status:404
  Important: Ensure your custom domain/subdomain (`hostname`) is properly configured in your Cloudflare DNS settings and pointed to the tunnel.
- Run the Cloudflare Tunnel Server (First Step for Backend Operation):
  Open a new terminal window. Navigate to your project folder and run the following command. The exact path to `cloudflared` might vary, but `/usr/local/bin/cloudflared` is a common default.
  sudo /usr/local/bin/cloudflared --config config.yml tunnel run project-glyph-motion-tunnel
6. Run the Backend Orchestrator (`tg.py` / telegram.py):
Open a new terminal window. Navigate to your project folder and run:
python tg.py
(Use `python3` on Linux/Ubuntu or `python` on Windows, depending on your system's default.)
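Alternatively, `web_server_public.py` is intended to automate the whole backend startup. A simplified sketch of that idea, launching the tunnel and the orchestrator together, might look like this (paths and behaviour here are assumptions; see the actual script for details):

import subprocess
import sys

CLOUDFLARED = "/usr/local/bin/cloudflared"  # adjust to your cloudflared path
TUNNEL_NAME = "project-glyph-motion-tunnel"

def main() -> None:
    # Start the Cloudflare Tunnel in the background.
    tunnel = subprocess.Popen(
        [CLOUDFLARED, "--config", "config.yml", "tunnel", "run", TUNNEL_NAME]
    )
    try:
        # Run the Flask/Telegram orchestrator in the foreground.
        subprocess.run([sys.executable, "tg.py"], check=True)
    finally:
        # Stop the tunnel when tg.py exits or is interrupted.
        tunnel.terminate()
        tunnel.wait()

if __name__ == "__main__":
    main()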
Usage
1. Ensure Backend System is Running:
Make sure your `tg.py` server is running locally AND the Cloudflare Tunnel is established. The "Server Status" indicator on the website should show "Online".
To bring your backend system back online after stopping it, simply run these two commands (in separate terminal windows) from your project folder:
- Start Cloudflare Tunnel:
  sudo /usr/local/bin/cloudflared --config config.yml tunnel run project-glyph-motion-tunnel
- Start Backend Orchestrator:
  python tg.py
  (Use `python3` on Linux/Ubuntu or `python` on Windows, depending on your system's default.)
Once these two commands are running, your backend will be fully operational again, and the "Server Status" indicator on your website (`index.html`) will turn green, confirming that the backend is running and accessible.
2. Access the Project:
- Web Interface: Access your website via your GitHub Pages URL (e.g., `https://your-username.github.io/your-repo-name/`).
- Admin Login: Navigate to `admin.html` from your GitHub Pages URL to log in as an administrator.
- Admin Tracker: Once logged in, you can access `admin_tracker.html` from your GitHub Pages URL (or the link on the main page) to view detailed tracking data.
- Telegram Bot: Search for your bot's username on Telegram and start a chat. The bot is ready to use once `tg.py` is running.
  - Use the `/track` command followed by a video URL (e.g., `/track https://example.com/your_video.mp4`).
  - Note: Currently, direct local video file uploads via Telegram are not supported; please use a video URL with the `/track` command.
3. Provide Video Source (Web Interface):
- Video URL: Paste a direct link to a video file (e.g., `.mp4`, `.mov`) into the "Video URL" field.
- Upload Video File: Click "Browse" to select a video file from your local machine.
- Note: Providing a URL is generally faster as it avoids uploading the file to your local server.
4. Start Tracking (Web Interface):
Click the "Start Tracking" button.
5. Monitor Status:
The "Live Processing Status" area will update with messages from the backend system.
6. View Processed Videos:
Once processing is complete and `tg.py` (via `gh.py`) has updated the `videos.json` file and committed it to GitHub, the new video will appear in the "Processed Videos" gallery. You can click "Download / View Full" to see the tracked video.
7. Toggle Layout/View:
Use the "Enlarge View" / "Compact View" button to switch the main content box size, and the "Show 2 Videos per Row" / "Show 3 Videos per Row" button to adjust the video gallery layout. Your preferences will be saved locally.
How Video Processing Works
1. When you click "Start Tracking" on the web interface, the frontend (index.html
) sends the video (either the file or URL) to your local tg.py
server via the Cloudflare Tunnel public URL (e.g., https://vot.onehorizon.me
).
2. tg.py
(telegram.py) receives the request. It then calls ot.py
(object_tracker.py) to perform the actual object tracking on the video.
3. After ot.py
completes processing, tg.py
orchestrates the post-processing steps by calling gh.py
(github.py):
gh.py
first uploads the processed video to Google Drive.- It then updates the
videos.json
file in your local repository with the new video's metadata (including the Google Drive embed link). - Finally,
gh.py
commits these changes to your GitHub repository.
4. The frontend (index.html
) continuously polls the videos.json
file (hosted on GitHub Pages) to update the gallery with new processed videos.
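To make the pipeline concrete, here is a heavily simplified, hypothetical outline of those steps in Python. It assumes `videos.json` is a JSON array of entries and uses illustrative field names; the real `tg.py` runs this asynchronously, streams live status to the frontend, and handles errors and cleanup:

import json
import subprocess
import sys

def process_video(input_path: str, drive_embed_url: str) -> None:
    # 1. Object tracking: ot.py writes the annotated video into output/.
    subprocess.run([sys.executable, "ot.py", "--input_video", input_path], check=True)

    # 2. In the real backend, gh.py now uploads the processed file to Google Drive
    #    and obtains a public embed link (passed in here as drive_embed_url).

    # 3. Append the new entry to videos.json (field names are illustrative).
    entry = {"title": input_path.split("/")[-1], "embed_url": drive_embed_url}
    with open("videos.json", "r+") as f:
        videos = json.load(f)
        videos.append(entry)
        f.seek(0)
        json.dump(videos, f, indent=2)
        f.truncate()

    # 4. gh.py then commits the updated videos.json to GitHub, and the frontend
    #    picks up the new entry on its next poll of GitHub Pages.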
How This Project Works – A Visual Representation
This diagram provides a high-level overview of the Project GlyphMotion architecture, showing how different components interact to enable real-time object tracking and video management.
Important Notes
- `ot.py` (object_tracker.py) as a Standalone Script:
  While this project integrates `ot.py` into a full web and Telegram bot system, `ot.py` itself is an independent script. You can use it privately for local video processing without setting up the entire web frontend or Cloudflare Tunnel. Simply run the Python script from your terminal and provide a video file (e.g., `python ot.py --input_video your_video.mp4`) or just drag and drop the file into the terminal window (no need to provide the flags). The processed video will be saved locally. However, for a richer experience with public access, gallery features, and Telegram bot integration, the full setup is recommended. (See the standalone tracking sketch after this list.)
- Backend System Must Be Running: The entire web application relies on `tg.py` (telegram.py) running locally and the Cloudflare Tunnel being active. If these are stopped, the tracking functionality will fail.
  To bring your backend system back online after stopping it, simply run these two commands (in separate terminal windows) from your project folder:
  - Start Cloudflare Tunnel:
    sudo /usr/local/bin/cloudflared --config config.yml tunnel run project-glyph-motion-tunnel
  - Start Backend Orchestrator:
    python tg.py
    (Use `python3` on Linux/Ubuntu or `python` on Windows, depending on your system's default.)
  Once these two commands are running, your backend will be fully operational again, and the "Server Status" indicator on your website (`index.html`) will turn green, confirming that the backend is running and accessible.
- GPU Requirement for Performance:
  The object tracking process in `ot.py` (object_tracker.py) is computationally intensive. For optimal performance, it is highly recommended to run this project on a device with a dedicated NVIDIA GPU that supports CUDA. Both Windows and Linux operating systems are supported for CUDA.
- With GPU (CUDA): In testing with a Ryzen 5 4600H CPU and a GTX 1650 Laptop GPU, processing a 4K video at 24 FPS yielded approximately 12 FPS. For a 1080p video at 24-60 FPS, the processing speed was around 24 FPS.
- CPU Only: If a dedicated GPU with CUDA is not available, the script will automatically fall back to CPU-only processing. While the project will still function, performance will be significantly slower. In testing, CPU-only mode (on the same Ryzen 5 4600H) processed both 4K and 1080p videos at a rate of approximately 1.2-1.6 FPS.
- GitHub Pages for `videos.json` with Private Repository (Crucial for Free Hosting):
  The primary setup assumes you are using GitHub Pages to host your `videos.json` file. This allows for easy updates and sharing of processed video links.
  To run this project completely free of cost and keep your GitHub repository private while still having GitHub Pages accessible by anyone, you MUST have the GitHub Education Pack. This pack grants you access to features like private repositories with GitHub Pages. Without it, a private repository's GitHub Pages would not be publicly accessible, or you would need a public repository, which might not be desired for all projects.
  Ensure your GitHub Pages is correctly configured, with a public repository (if you don't have GitHub Education) or a private one (if you do), so that `videos.json` is publicly accessible.
- Cloudflare Tunnel Configuration: Ensure your `config.yml` is correctly set up with your tunnel ID, credentials file path, and the correct `hostname` that points to your local Flask server. Your domain's DNS records in Cloudflare must also be configured to route traffic through the tunnel.
- Video Processing Time: Processing large or long videos can take a significant amount of time, depending on your local machine's specifications and internet speed.
- Google Drive Public Access: For videos to be embeddable and viewable on your GitHub Pages site, the uploaded videos on Google Drive must be set to public access (read-only). `gh.py` attempts to do this automatically.
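As referenced above, the Ultralytics API keeps the core of standalone tracking quite small. The sketch below shows basic YOLOv8(m) tracking on a local file; the real `ot.py` adds its own annotation, progress reporting, and resource monitoring, so this is an illustration rather than its actual code:

from ultralytics import YOLO

# Load the pre-trained YOLOv8(m) weights shipped with the project.
model = YOLO("yolov8m.pt")

# Track objects across frames and save an annotated copy of the video.
# Ultralytics writes the result under its runs/ directory by default.
results = model.track(source="your_video.mp4", save=True)

for result in results:
    # result.boxes holds detections; .id carries tracker-assigned object IDs.
    if result.boxes.id is not None:
        print(result.boxes.id.tolist())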
Contributing
Feel free to fork this repository, submit pull requests, or open issues if you find bugs or have suggestions for improvements.
License
MIT License © 2025 Sayan Sarkar & Shitij Halder
Credits
Made with ❤️ by Sayan and Shitij
This project is based on Ultralytics YOLOv8(m), an acclaimed real-time object detection and image segmentation model.
Security and Data Privacy (Admin Tracker)
Before you start exploring this project, you must know a few things regarding data privacy. The Admin Tracking Dashboard (`admin_tracker.html`) is a core part of this project, designed purely for the project owner's security and operational monitoring. It allows the project owner to understand the origin of requests interacting with the backend for debugging, security auditing, and preventing misuse.
Please note: When you fork this project for your own use, you become the independent "user project owner." This means that the original project owners (Sayan and Shitij) will NOT have any access to any data you process or store on your fork. Your fork operates entirely independently, and the trust for responsible data handling rests with you. Your project's integrity will be built on the trust you potentially gain from your users.
WE, AS THE ORIGINAL PROJECT OWNERS, NEVER INTEND TO TRACK YOUR USERS' IP ADDRESSES OR THEIR LOCATIONS FOR PERSONAL OR BUSINESS MEANS, NOR WILL WE EVER DO SO. YOUR DATA IS SAFE WITH YOU.
Please be aware of the following:
- When you fork this project for your own use, you become the "user project owner." With this role comes the full responsibility for handling any user data, including IP addresses and geolocation information, in an ethical and lawful manner.
- Backend-Only Data: The IP address and geolocation data collected by the backend are stored locally on the server running `tg.py` (`tracking_data.json`). This sensitive user data is NEVER automatically pushed to GitHub or published anywhere externally. It remains on your local machine and is only viewable via the `admin_tracker.html` dashboard when you are logged in as an administrator.
- Your Responsibility: As a user of this project, you are provided with tools that could expose sensitive user data if misused. You must be responsible and ensure that you DO NOT publish, share, or use any of this sensitive data (IP addresses, geolocation information) for your personal gain, business purposes, or any other unethical/illegal means. The internet is fast these days; one data leak or misuse can lead to severe consequences, including public exposure and legal action. If we discover any misuse of this sensitive data by you, you will be publicly called out, your access to this project and its related services will be permanently revoked, and appropriate legal action may be pursued. We hope you understand the gravity of this responsibility.
- Limitations of Geolocation Data: It's important to understand that current IP-based geolocation APIs (especially free ones) often provide approximate location data. With dynamic IP addresses common for most internet users, it is generally not possible to get exact, real-time location data or personally identify individuals reliably using this information. The Google Maps and Google Earth links in the dashboard are primarily for visual representation and a "cool" factor, not for precise tracking or surveillance. They genuinely do not provide significant practical help for nefarious purposes.