No GPU.LAND videos yet. You could help us improve this page by suggesting one.
Based on our records, Vast.ai appears to be much more popular than GPU.LAND: we know of 223 links to Vast.ai, but have tracked only 8 mentions of GPU.LAND. We track product recommendations and mentions across various public social media platforms and blogs; these can help you identify which product is more popular and what people think of it.
I'm just going to mention here the experience of someone who ran gpu.land (it no longer exists). He did something similar, monetized it (very cheaply), and then had to shut down because people were running crypto miners on it. I hope you have a plan to avoid that type of abuse. Source: about 2 years ago
RIP to gpu.land... I was hoping they would take off because they seemed to have a cool product with great pricing. Source: almost 3 years ago
There's also https://gpu.land (which has their own comparison page). Source: almost 3 years ago
Heya, I'm also just keeping in touch. After like 1 month of not redditing, someone replied who claimed to be the developer of gpu.land. Apparently it is cloud computing for full Linux rather than the Jupyter notebook like what we tried before. Can I ask what the update is on the cloud computing site? I messaged the gpu.land person to see if we can get a free trial ($1 per hour on the cheapest one but I don't know... Source: about 3 years ago
There are also more affordable GPU-for-DL-lending options like gpu.land, although I have never used them so I can't vouch for them -- just something I saw on PH. Source: about 3 years ago
There are already ways to get around this. For example, renting compute from people who aren't in datacenters. Which is already a thing: https://vast.ai. - Source: Hacker News / 3 months ago
By "SETI" I assume you mean the SETI@Home distributed computing project. There's a two-way market where you can rent out your GPU here: https://vast.ai/. - Source: Hacker News / 4 months ago
- https://vast.ai/ (linked by gchadwick above). - Source: Hacker News / 5 months ago
Have you considered running on a cloud machine instead? You can rent machines on https://vast.ai/ for under $1 an hour that should work for small/medium models (I've mostly been playing with stable diffusion so I don't know what you'd need for an LLM offhand). Good GPUs and Apple hardware are pricey. Get a bit of automation set up with some cloud storage (e.g. Backblaze B2) and you can have a machine ready to run... - Source: Hacker News / 5 months ago
I have heard vast.ai is cheap but I haven't tried it out. https://websiteinvesting.com/reviews/vast-ai-review/. Source: 5 months ago
Banana.dev - Banana provides inference hosting for ML models in three easy steps and a single line of code.
iExec - Blockchain-Based Decentralized Cloud Computing.
Apple Core ML - Integrate a broad variety of ML model types into your app.
Amazon AWS - Amazon Web Services offers reliable, scalable, and inexpensive cloud computing services. Free to join, pay only for what you use.
TensorFlow Lite - Low-latency inference of on-device ML models.
Golem - Golem is a global, open-source, decentralized supercomputer that anyone can access.