What happened: GPU inference, vector storage, image hosting, backend servers. I broke down every piece of the infrastructure and added up the monthly bill.
What to watch next: follow-up discussion around production deployment and search costs.

r/programming covers production and search topics, with context drawn from the original post rather than recycled feed copy.
Verbatim description from the source feed, unedited as received:
r/programming (lean-left)
I priced out every piece of infrastructure for running CLIP-based image search on 1M images in production. GPU inference is 80% of the bill: a g6.xlarge running OpenCLIP ViT-H/14 costs $588/month and handles 50-100 img/s, while CPU inference gets you 0.2 img/s, which is not viable. Vector storage is cheap.
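The figures quoted above can be sanity-checked with some back-of-envelope arithmetic. This is a minimal sketch using only the numbers in the post (g6.xlarge at $588/month, 50-100 img/s on GPU, 0.2 img/s on CPU, GPU as 80% of the bill); the function name and the 80%-implied total are illustrative assumptions, not the author's actual spreadsheet.

```python
def embed_time_seconds(num_images: int, imgs_per_second: float) -> float:
    """Wall-clock time to embed a corpus at a given sustained throughput."""
    return num_images / imgs_per_second

N = 1_000_000

# GPU throughput range quoted in the post: 50-100 img/s.
gpu_slow = embed_time_seconds(N, 50.0)    # 20,000 s, about 5.6 hours
gpu_fast = embed_time_seconds(N, 100.0)   # 10,000 s, about 2.8 hours

# CPU throughput quoted in the post: 0.2 img/s.
cpu = embed_time_seconds(N, 0.2)          # 5,000,000 s, roughly 58 days

# If the $588/month GPU is ~80% of the bill, the implied total is:
implied_total = 588.0 / 0.80              # $735/month (assumption: exact 80%)

print(f"GPU: {gpu_slow/3600:.1f}-{gpu_fast/3600:.1f} h, "
      f"CPU: {cpu/86400:.0f} days, implied total ${implied_total:.0f}/mo")
```

The ~58-day CPU figure is why the post calls CPU inference "not viable": a single full re-embed of the corpus would take two months.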
Swarm Claim
What it costs to run 1M image search in production.
r/programming · report
What it costs to run 1M image search in production. GPU inference, vector storage, image hosting, backend servers. I broke down every piece of the infrastructure and added up the monthly bill.