Port Scan Timeout Error - No Open Ports Detected on Deploy

Hi Render Community,

I’m deploying a Python-based FastAPI project on Render. The build appears to complete successfully, but I keep running into the following error in the deploy logs:

==> No open ports detected, continuing to scan...
==> Docs on specifying a port: https://render.com/docs/web-services#port-binding
==> Port scan timeout reached, no open ports detected. Bind your service to at least one port. If you don't need to receive traffic on any port, create a background worker instead.

What I Have Tried:

  1. Ensured that my application listens on the PORT environment variable:
    import os
    import uvicorn
    
    if __name__ == "__main__":
        # Use the PORT value provided by Render, falling back to 10000 locally.
        port = int(os.environ.get("PORT", 10000))
        uvicorn.run("main:app", host="0.0.0.0", port=port)
    
  2. Verified that the PORT environment variable is correctly set in Render’s settings (one further check I could add is sketched after this list).
  3. Checked the code for potential issues and confirmed that the FastAPI application starts as expected.
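
One check I have not added yet, so this is only a sketch mirroring the snippet above (the logger name and default port are assumptions), would be to log the resolved value of PORT right before starting uvicorn, so the deploy logs show exactly what the process tries to bind:

    import logging
    import os

    import uvicorn

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("startup")

    if __name__ == "__main__":
        # Log the resolved port so Render's deploy logs show exactly what uvicorn binds to.
        port = int(os.environ.get("PORT", 10000))
        logger.info("Starting uvicorn on 0.0.0.0:%d", port)
        uvicorn.run("main:app", host="0.0.0.0", port=port)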

My Observations:

  • The application logs indicate successful initialization of my database, Telegram Bot, and other components.
  • Despite this, Render seems unable to detect any open ports.

My Questions:

  1. What could cause Render to fail to detect the open port, even though the application runs without issues locally?
  2. Should I consider converting this service to a Background Worker if it primarily handles Telegram Bot functions using Webhooks or Long Polling? (A rough sketch of the webhook route I have in mind follows this list.)
  3. Are there any Render-specific configurations I might be missing?
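
For context on question 2, the webhook variant I have in mind would look roughly like the sketch below. The path and handler name are placeholders rather than my actual code; the point is that Telegram delivers webhook updates as inbound HTTP POSTs, so this variant still needs an open port, while long polling only makes outbound requests and would not.

    from fastapi import FastAPI, Request

    app = FastAPI()

    @app.post("/telegram/webhook")
    async def telegram_webhook(request: Request):
        # Telegram sends each update as a JSON POST, so an open HTTP port is required here.
        update = await request.json()
        # ... hand the update off to the bot's dispatcher ...
        return {"ok": True}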

Project Overview:

  • Type: FastAPI + Telegram Bot
  • Port Configuration: Uses os.environ.get("PORT", 10000) and binds to 0.0.0.0 (a stripped-down test version is sketched after this list).
  • Dependencies: Uvicorn, FastAPI, python-telegram-bot, motor (MongoDB).
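
If it helps narrow things down, this is a stripped-down entry point I could deploy as a test, with the bot and database code removed; the /health route name is arbitrary and is only there so the port scan has something listening once uvicorn starts:

    import os

    import uvicorn
    from fastapi import FastAPI

    app = FastAPI()

    @app.get("/health")
    async def health():
        # Minimal route; the only goal is to confirm that the port actually gets bound.
        return {"status": "ok"}

    if __name__ == "__main__":
        port = int(os.environ.get("PORT", 10000))
        uvicorn.run(app, host="0.0.0.0", port=port)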

Any guidance or suggestions would be greatly appreciated. Thanks in advance for your help!

I am facing the same problem. Were you able to solve your issue? If so, could you please share the solution with me?