I’ve deployed a Django app that serves machine learning models for computer vision, specifically image classification. It turns out prediction is too slow, and sometimes I run out of compute time before the classification finishes or the prediction is returned. I was wondering if there’s a way to speed this up, perhaps by using GPUs. Thanks a lot in advance.
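For illustration only, here is a minimal sketch of the kind of view being described, assuming PyTorch/torchvision, a stand-in `resnet18` classifier, and a hypothetical `classify` view (none of these names come from the original post). One common cause of slow CPU-only predictions is rebuilding or reloading the model inside the request handler, so the sketch loads it once per worker process instead:

```python
# views.py (hypothetical sketch) -- the model is loaded once at import time,
# so each request only pays for inference, not for reconstructing the model.
import torch
from torchvision import models, transforms
from PIL import Image
from django.http import JsonResponse

# Loaded once per worker process, not per request.
_model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
_model.eval()

_preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify(request):
    # Read the uploaded image and run a single forward pass on CPU.
    image = Image.open(request.FILES["image"]).convert("RGB")
    batch = _preprocess(image).unsqueeze(0)   # shape: [1, 3, 224, 224]
    with torch.inference_mode():              # disables autograd overhead
        logits = _model(batch)
    predicted = int(logits.argmax(dim=1))
    return JsonResponse({"class_index": predicted})
```

This is only a sketch of one CPU-side factor; it is not a substitute for GPU acceleration when the model itself is large.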
Hi,
Render doesn’t currently offer GPU instances, but it is listed on our feedback site as a feature request: https://feedback.render.com/features/p/gpu-instances
Please feel free to upvote/comment on the feature request to be notified of any updates.
Kind regards
Alan