Python Tor Hosting — Django, Flask & FastAPI on .onion
Run your Python web applications as Tor hidden services with AnubizHost. Our Python hosting supports Django, Flask, FastAPI, and any WSGI/ASGI framework with Gunicorn or Uvicorn, PostgreSQL, Redis, and automated deployment — all accessible exclusively through your .onion address.
Need this done for your project?
We implement, you ship. Async, documented, done in days.
Python Web Framework Support
AnubizHost's Python Tor hosting supports every major Python web framework. Django deployments include Gunicorn as the WSGI server with Nginx reverse proxy, PostgreSQL for the database, and Redis for caching and Celery task queues. We configure Django settings for production security, including proper SECRET_KEY management, ALLOWED_HOSTS set to your .onion address, and secure cookie settings.
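As a sketch, a hardened settings fragment might look like the following; the `yourservice.onion` hostname and the `DJANGO_SECRET_KEY` variable name are placeholders, not values from our tooling:

```python
# settings.py (excerpt) — illustrative production hardening for an onion-only
# Django deployment; "yourservice.onion" and DJANGO_SECRET_KEY are placeholders
import os

SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY")  # injected via the service env, never committed
DEBUG = False
ALLOWED_HOSTS = ["yourservice.onion"]  # reject requests with any other Host header

# .onion services are typically served over plain HTTP inside Tor's own
# end-to-end encryption, so the Secure cookie flag would block cookies entirely
SESSION_COOKIE_SECURE = False
SESSION_COOKIE_HTTPONLY = True  # keep session cookies out of reach of JavaScript
CSRF_COOKIE_SECURE = False
CSRF_TRUSTED_ORIGINS = ["http://yourservice.onion"]
```

Note that the Secure cookie flag is deliberately off: onion traffic is encrypted by Tor itself rather than TLS, so cookies marked Secure would never be sent over the plain-HTTP hop.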
Flask and FastAPI applications are equally well-supported. Flask apps run behind Gunicorn with gevent or eventlet workers for concurrent request handling. FastAPI applications use Uvicorn with ASGI for native async support, delivering excellent performance for API-heavy workloads. Both frameworks benefit from our Nginx front-end for static file serving and connection management.
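To make the ASGI interface concrete, here is a minimal raw ASGI callable of the kind Uvicorn serves directly (`uvicorn app:app`); frameworks like FastAPI and Starlette produce the same kind of callable under the hood. The JSON body is just a stand-in:

```python
# app.py — a minimal raw ASGI application that Uvicorn can serve as-is;
# shown only to illustrate the interface frameworks implement for you
async def app(scope, receive, send):
    assert scope["type"] == "http"  # this toy app only handles HTTP
    body = b'{"status": "ok"}'
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [
            (b"content-type", b"application/json"),
            (b"content-length", str(len(body)).encode()),
        ],
    })
    await send({"type": "http.response.body", "body": body})
```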
We also support less common frameworks like Pyramid, Tornado, Sanic, and Starlette. If it runs on Python, it runs on our Tor hosting. Provide a requirements.txt or Pipfile and a WSGI/ASGI entry point, and we configure the rest. Virtual environments are used for every deployment to prevent dependency conflicts.
Development and Deployment Workflow
Deploy Python applications via our Tor-accessible Git hosting. Push your code, and our deployment system creates a fresh virtual environment, installs dependencies from your requirements file, runs database migrations, collects static files, and restarts Gunicorn. All deployment steps are logged and visible in your control panel for debugging.
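The pipeline described above can be sketched as a short script. The paths, service name, and exact commands here are illustrative assumptions for a Django app, not AnubizHost's actual deploy tooling:

```python
# deploy.py — illustrative sketch of the post-push deploy steps; the /srv/myapp
# paths and the "gunicorn-myapp" service name are placeholders
import subprocess
import sys

STEPS = [
    [sys.executable, "-m", "venv", "/srv/myapp/venv"],                        # fresh virtualenv
    ["/srv/myapp/venv/bin/pip", "install", "-r", "requirements.txt"],         # install pinned deps
    ["/srv/myapp/venv/bin/python", "manage.py", "migrate", "--noinput"],      # database migrations
    ["/srv/myapp/venv/bin/python", "manage.py", "collectstatic", "--noinput"],# static files
    ["systemctl", "restart", "gunicorn-myapp"],                               # restart the app server
]

def deploy(runner=subprocess.run):
    """Run each step in order, aborting on the first failure."""
    for cmd in STEPS:
        result = runner(cmd)
        if result.returncode != 0:
            raise RuntimeError(f"deploy step failed: {' '.join(cmd)}")
```

The `runner` parameter exists only so the sequence can be exercised without touching the system; a real script would call `subprocess.run` directly and log each step.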
We support pyproject.toml, setup.cfg, requirements.txt, Pipfile, and Poetry lock files for dependency management. Pip packages are downloaded through Tor to prevent clearnet exposure. We maintain a PyPI mirror cache that significantly speeds up common package installations — popular packages like Django, Flask, and their dependencies install in seconds from the local cache.
For reproducible deployments, we recommend using pip-compile to generate pinned requirements files with exact version numbers and hashes. This ensures that every deployment installs identical dependencies regardless of when it runs. Our deployment system verifies package hashes automatically when they are present, catching any tampering during download.
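The integrity check behind `--hash=sha256:...` entries boils down to recomputing a SHA-256 digest of the downloaded artifact and comparing it to the pinned value. A toy illustration (pip performs this natively when hashes are present; you never do this yourself):

```python
# hashcheck.py — toy illustration of pinned-requirements hash verification;
# pip's --require-hashes mode does exactly this kind of comparison internally
import hashlib

def sha256_matches(data: bytes, expected_hex: str) -> bool:
    # recompute the digest of the downloaded bytes and compare to the pin
    return hashlib.sha256(data).hexdigest() == expected_hex
```

Because the pinned digest lives in your version-controlled requirements file, a package swapped out in transit (or on a mirror) fails the comparison and the install aborts.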
Background Tasks and Scheduling
Most Python web applications need background task processing. Our hosting includes Celery with Redis as the message broker, configured to run alongside your web application. Define your tasks in your Django or Flask app, and they execute asynchronously in Celery worker processes. Beat scheduling handles periodic tasks like database maintenance, report generation, and data aggregation.
For simpler scheduling needs, we configure systemd timers or cron jobs that run Python scripts at specified intervals. These are useful for tasks that do not need the full Celery infrastructure — database backups, log processing, health checks, or automated content updates. Timer definitions are version-controlled alongside your application code.
Long-running processes like data scrapers (over Tor), blockchain node synchronization, or machine learning training jobs can run as separate systemd services with their own process management and logging. Process supervisors such as Supervisor are also available for managing multiple Python processes with automatic restart and resource limits.
Data Science and Machine Learning
Python's dominance in data science and machine learning makes our Tor hosting an excellent platform for privacy-preserving analytics and AI services. Install NumPy, pandas, scikit-learn, PyTorch, and TensorFlow from our cached PyPI mirror. Serve machine learning models behind a FastAPI endpoint accessible only through your .onion address.
Jupyter notebooks can be hosted as a Tor hidden service for private data exploration and collaboration. Access your notebook server through Tor Browser with token-based authentication. This setup is valuable for research teams handling sensitive data who need a collaborative analysis environment without exposing their work to the clearnet.
For GPU-accelerated workloads, our dedicated server plans offer NVIDIA GPU options with CUDA support. Train models on private data sets without sending your data to cloud providers. Combined with Tor's anonymity properties, this enables truly private machine learning — neither the training data nor the model's existence is visible to outside observers.
Related Services
Why Anubiz Host
Ready to get started?
Skip the research. Tell us what you need, and we'll scope it, implement it, and hand it back — fully documented and production-ready.