Software Alternatives & Reviews
Table of contents
  1. Videos
  2. Social Mentions
  3. Comments

Vast.ai

GPU Sharing Economy: One simple interface to find the best cloud GPU rentals.

Vast.ai Reviews and details

Screenshots and images

  • Vast.ai Landing page (2023-10-08)

Badges

Promote Vast.ai. You can add any of these badges on your website.
SaaSHub badge

Videos

Using Vast.ai to set up a machine learning server

Social recommendations and mentions

We have tracked the following product recommendations or mentions on various public social media platforms and blogs. They can help you see what people think about Vast.ai and what they use it for.
  • Nvidia pursues $30B custom chip opportunity with new unit
    There are already ways to get around this. For example, renting compute from people who aren't in datacenters. Which is already a thing: https://vast.ai. - Source: Hacker News / 3 months ago
  • A SETI-like project to train LLM on libgen, scihub and the likes?
    By "SETI" I assume you mean the SETI@Home distributed computing project. There's a two-way market where you can rent out your GPU here: https://vast.ai/. - Source: Hacker News / 4 months ago
  • Ask HN: What's the best hardware to run small/medium models locally?
    - https://vast.ai/ (linked by gchadwick above). - Source: Hacker News / 5 months ago
  • Ask HN: What's the best hardware to run small/medium models locally?
    Have you considered running on a cloud machine instead? You can rent machines on https://vast.ai/ for under $1 an hour that should work for small/medium models (I've mostly been playing with Stable Diffusion, so I don't know what you'd need for an LLM offhand). Good GPUs and Apple hardware are pricey. Get a bit of automation set up with some cloud storage (e.g. Backblaze B2) and you can have a machine ready to run... - Source: Hacker News / 5 months ago
  • Budget-friendly Cloud server to host OpenAI Whisper?
    I have heard vast.ai is cheap but I haven't tried it out. https://websiteinvesting.com/reviews/vast-ai-review/. Source: 5 months ago
  • getting 'fake_useragent' error when trying to load models from civitai browser plus extension
    Hello, I've been having trouble getting the civitai browser plus extension to retrieve models. This happens on every instance I have tried through vast.ai using the secure cloud GPUs. Does anyone have any idea why this could be? Source: 5 months ago
  • If you use a cloud GPU service to run SD, which one do you prefer right now?
    I used to prefer vast.ai since I could rent instances for less than on Runpod, but prices have hiked so much on verified instances that Runpod now costs about as much as interruptible instances, and I get much faster download speeds on it. I have no preference, because on the other hand interruptible instances can get taken even when you're in the middle of using the GPU, and uninterruptible ones cost ~25% more. I would... Source: 6 months ago
  • Virtual machine Unreal Engine 5 image running really slow
    I'm using the virtual machines on here and have installed the UE 5 custom-built image, but my virtual machine is extremely sluggish. My local network is fast and I've had zero issues with latency on vast.ai, but on Azure it's incredibly slow, even with my region set to the one I live in. I have no idea what's making this so slow. Source: 7 months ago
  • Need help connecting Oobabooga running on vast.ai with ST
    Hey guys, I need your help: what do I enter under Blocking API URL and Streaming API URL while running oobabooga on vast.ai? I tried using the public IP shown on vast.ai with the right port, but it won't connect. Source: 8 months ago
  • AMD users, what token/second are you getting?
    Currently, I'm renting a 3090 on vast.ai, but I would love to be able to run a 34B model locally at more than 0.5 T/S (I've got a 3070 8GB at the moment). So my question is, what tok/sec are you guys getting using (probably) ROCM + ubuntu for ~34B models? Source: 8 months ago
  • Voyages in the Domain of Artificial Ingenuity: Unveiling the Enigma of Stable Diffusion and the Odyssey of Comfy UI
    The crux of my investigations has gravitated chiefly around the concept of Stable Diffusion, with AUTOMATIC1111’s SD Web UI as my conduit for experimentation. My endeavors have transpired within the ethereal confines of cloud computing, leveraging the manifold offerings of Vast AI services. - Source: dev.to / 9 months ago
  • Best cloud provider for cheap, reliable, and flexible SD usage?
    I want to be able to run Kohya, SD webui, comfyui, whatever. I want to be able to SSH into my instance and do whatever I'd like with it. I like vast.ai, but unfortunately it has a system where instances get stuck on "scheduling" when being used by someone else. It isn't your instance unless you run it 24/7. I want an on-demand instance that is MINE 24/7 regardless of whether or not I'm running it. I'd... Source: 10 months ago
  • ex-ChatGPT newbie question
    You could check out the wiki for this sub, and also google vast.ai; there are tutorials. ggml is this: https://github.com/ggerganov/ggml. If you have any specific questions you could ask me, or look it up. Source: 10 months ago
  • What do you do with the GPUs?
    If you have Nvidia cards, you can rent them out via vast.ai, but you will need to upgrade your motherboard to one with more PCIe bandwidth and RAM. Source: 10 months ago
  • Can SD do this?
    As for online/cloud solutions, I highly doubt that anything good would be free. From time to time I use vast.ai to run automatic1111, but that is not free. Source: 10 months ago
  • Using SillyTavern with vast.ai or runpod?
    Hi. I've been experimenting with local models on vast.ai, and I was wondering if someone has managed to connect local SillyTavern to oobabooga running on the cloud. I can't get oobabooga's public_api to run; it says it can't run cloudflared. Source: 10 months ago
  • Free LLM api
    You could use vast.ai; they've got docker images available for some of this stuff. Source: 10 months ago
  • Is it possible to rent a service from the cloud but still run KAI from my desktop?
    Yes, you can run KAI in the cloud, for example on runpod.io or vast.ai and then use a local KAI or SillyTavern on your computer to connect to the cloud KAI via API. Runpod has templates to install Kobold AI easily, that might be the fastest way. Source: 10 months ago
  • Recommend me a computer for local a.i for 500 $
    Just keep using what you have, code locally and when you actually need to run something that needs a beefy GPU you rent a machine in the cloud (e.g. vast.ai) for a few hours, get more GPU power than you could ever have locally, and then turn it off (and stop paying) when you don't need it. Source: 10 months ago
  • Fastest way to crack bcrypt $2y$10?
    Vast.ai, if you know anything about the passwords: make the wordlist and use rules on it. The other way is brute force, but bcrypt is a very slow hash. Source: 10 months ago
  • Whats a good pc to buy for local training ?
    I am not experienced with training, but I think that with 24GB VRAM you can create a llama-30b LoRA using QLoRA. If all you want to do is train models, it may be worth it to rent a vast.ai instance or even use Google Colab. Source: 10 months ago


Generic Vast.ai discussion


This is an informative page about Vast.ai. You can review and discuss the product here. The primary details have not been verified within the last quarter, so they might be outdated. If you think we are missing something, please use the options on this page to comment or suggest changes. All reviews and comments are highly encouraged and appreciated, as they help everyone in the community make an informed choice. Please always be kind and objective when evaluating a product and sharing your opinion.