Raspberry Pi 5 8GB current price in 2026

The bare board has stayed at $80 USD MSRP since launch. Here is the verified number, where the cost actually goes once you build a working unit, and what an 8GB Pi 5 can and cannot do if you are eyeing it as a local AI agent host.

Matthew Diakonov
6 min read
Direct answer, verified 2026-05-14

$80 USD MSRP for the 8GB board.

Street prices at authorized resellers in May 2026 range from $80 to about $95 once shipping and local tax are folded in. The authoritative source is raspberrypi.com/products/raspberry-pi-5. For region-specific listings, check the approved reseller list.

Where the cost actually goes

$80 is the bare-board price. Almost nobody runs a Pi 5 as a bare board. A usable setup adds parts that the price tag does not include, and the total lands a lot closer to $150 than $80.

| Component | Typical 2026 price | Notes |
| --- | --- | --- |
| Pi 5 8GB board | $80 | MSRP, board only |
| Official 27W USB-C PSU | $12 | The Pi 5 will throttle on underpowered USB-C bricks |
| Active cooler + case | $10 to $15 | Required for sustained workloads |
| microSD card (64GB A2) | $10 | Or NVMe HAT plus SSD for $40 to $60 |
| Micro-HDMI cable | $5 | Only if you attach a display; skip for headless use |
| Working unit total | $117 to $122 | Starter kits bundle this at $130 to $170 |

Has the price moved since launch?

Not in any sustained way. The Pi 5 launched in October 2023 at $80 for the 8GB. Stock was tight through early 2024 and resellers asked premiums. By mid-2024 inventory normalized and the board went back to MSRP at most authorized resellers. The 16GB variant launched in January 2025 at $120 and briefly stirred rumors that the 8GB was on the way out, but Raspberry Pi Ltd kept it in the lineup.

The pattern since: steady MSRP, with mild Pi Day and Black Friday discounts, and occasional regional spikes when shipping gets weird. If you see a 30% premium on a reseller listed on raspberrypi.com in mid-2026, it is most likely a bundle you are looking at, not the bare board.

Can an 8GB Pi 5 host a local AI coding agent?

This is the question most readers searching this query are actually trying to answer, so it is worth being precise. The honest answer: for non-LLM agent work the Pi 5 is great; for on-device language model inference at any productive size, it is not the right machine.

With 8GB of LPDDR4X-4267 memory and no discrete GPU, a Pi 5 can host quantized 3B-class models (Phi-3 Mini, Qwen2.5 1.5B) on CPU at single-digit tokens per second. That is fine for demo loops and embedded use cases (a sensor that classifies sound, a router that summarizes log lines). It is not fine for pair-programming against a real codebase, where you need 7B to 13B class models running at 15+ tokens per second with a useful context window.
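A quick back-of-envelope shows why the ceiling is single digits: CPU token generation is memory-bandwidth bound, since every generated token streams roughly the whole weight file through the CPU. The sketch below uses assumed numbers (LPDDR4X-4267 on a 32-bit bus giving on the order of 17 GB/s of usable bandwidth, 4-bit quantization at ~0.5 bytes per parameter); treat it as an optimistic upper bound, not a benchmark.

```python
# Rough upper bound on CPU decode speed for a memory-bound LLM.
# Assumed, not official: ~17 GB/s usable bandwidth on the Pi 5,
# ~0.5 bytes per parameter for a 4-bit quantized model.

def tokens_per_second_upper_bound(bandwidth_gb_s: float,
                                  params_billions: float,
                                  bytes_per_param: float) -> float:
    """Bandwidth divided by model size: a ceiling on decode throughput."""
    model_gb = params_billions * bytes_per_param
    return bandwidth_gb_s / model_gb

# 3B-class model at 4-bit: ~1.5 GB of weights.
print(round(tokens_per_second_upper_bound(17.0, 3.0, 0.5), 1))   # ceiling ~11 tok/s
# 13B at 4-bit is ~6.5 GB, which barely fits in 8 GB alongside the OS.
print(round(tokens_per_second_upper_bound(17.0, 13.0, 0.5), 1))  # ceiling ~2.6 tok/s
```

Real-world throughput lands well below the ceiling once attention, KV-cache traffic, and thermal throttling are counted, which is how a theoretical ~11 tok/s becomes single digits in practice.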

Where a Pi 5 8GB does shine for agent workloads: as an always-on process host. MQTT brokers, scheduled scrapers, webhook relays, home-automation bridges that call remote LLM APIs, a tiny local vector DB for retrieval. The board is reliable, sips 5-12W under load, and at $80 it is the cheapest serious Linux box you can buy. The agent loop runs on the Pi, the model lives in the cloud, and you get most of the value without paying for a workstation.
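The split described above can be sketched as a small routing function: deterministic work (thresholds, polling) stays on the Pi, and anything needing language understanding is proxied to a hosted model. Everything here is hypothetical, including the topic names and the `call_remote_llm` stub; a real deployment would wire this into an MQTT client such as paho-mqtt and an actual LLM API.

```python
# Sketch of "agent loop on the Pi, model in the cloud".
# Topic names, payload shapes, and call_remote_llm are illustrative stand-ins.

def call_remote_llm(prompt: str) -> str:
    """Placeholder for an HTTPS call to a hosted model API."""
    return f"[remote model reply to: {prompt!r}]"

def handle_event(topic: str, payload: str) -> str:
    # Cheap, deterministic checks run locally on the Pi.
    if topic == "sensors/temperature":
        return "alert" if float(payload) > 80.0 else "ok"
    # Language-heavy work is forwarded to the remote model.
    if topic == "logs/summarize":
        return call_remote_llm(f"Summarize this log line: {payload}")
    return "ignored"

print(handle_event("sensors/temperature", "72.5"))  # ok
print(handle_event("sensors/temperature", "91.0"))  # alert
```

The design point is that the always-on loop costs a few watts on the Pi, while the expensive inference happens elsewhere and only when an event actually needs it.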

If your goal is to run the coding agent locally for privacy or offline reasons, the price-to-performance floor in 2026 is an Apple Silicon Mac with at least 16GB of unified memory. A used M1 Mac mini 16GB sits around $400-$500 on the secondary market and runs 13B quantized models comfortably with MLX. That is a 5x outlay against the Pi, and it is a different machine for a different job. We have written about that tradeoff in more detail elsewhere.

A note on Fazm and Pi 5

For full transparency: Fazm is the macOS app this site is about. It wraps Claude Code and Codex via ACP in a native Mac UI with persistent sessions, one-click chat forking, and no auto-compacting of context. It does not run on Raspberry Pi (it requires macOS 14 or newer and uses macOS accessibility APIs to drive native Mac apps and the browser). If you are shopping for a Pi 5 because you want to spin up a small Linux box for tinkering, that is the right tool for the job and Fazm is not relevant. If you are shopping for a Pi 5 because you want to run a serious local coding agent, the Pi is the wrong shape of machine, and an Apple Silicon Mac is where Fazm and the wider Apple Silicon local AI ecosystem live.

Where to verify the number yourself

Three places, in order of authority: the product page at raspberrypi.com/products/raspberry-pi-5 for the MSRP, the approved reseller list at raspberrypi.com/resellers for region-specific street prices, and rpilocator.com for near-real-time stock and price tracking across resellers.

Picking hardware for an AI agent workflow?

If you are weighing a Pi 5 against a Mac mini against a cloud agent, a 20-minute call can save you a bad $500 purchase.

Frequently asked questions

What is the current Raspberry Pi 5 8GB price in 2026?

The official MSRP set by Raspberry Pi Ltd is $80 USD for the 8GB board only. That number has held since the Pi 5 launched in October 2023 and the company has not announced a change as of May 2026. Authorized resellers (CanaKit, PiShop, The Pi Hut, OKdo, Pimoroni) list it between $80 and $95 once you include shipping and local tax. The number to verify against is the price on raspberrypi.com/products/raspberry-pi-5, which is the canonical source.

Why is the street price often higher than $80?

Two reasons. First, $80 is the bare-board price, and a working setup needs the official 27W USB-C power supply ($12), a microSD card or NVMe HAT plus SSD ($10-$60), a case with the active cooler ($10-$15), and an HDMI cable ($5). A starter kit bundling those parts typically lands at $130-$170. Second, regional VAT, customs, and reseller margin push the unit price above MSRP outside the US and UK, especially in the EU and APAC. The board itself is still $80, but very few people pay only that.

Has the Pi 5 8GB price dropped or risen since launch?

Neither in any sustained way. After the 2023 launch shortages cleared in mid-2024, the board has been at MSRP at most authorized resellers. There were brief premium spikes around the 16GB variant's January 2025 launch, when the 8GB was rumored to be discontinued (it was not), and brief discounts during Black Friday and Pi Day each year, but the steady-state price for an 8GB board has been $80 USD since stock normalized.

Is the 8GB worth it over the 4GB or 2GB?

It depends entirely on the workload. For headless server use (Pi-hole, Home Assistant, NAS controller, light Docker), 4GB at $60 is plenty and the $20 saving compounds across a fleet. For desktop use, browsing with multiple Chromium tabs, or anything touching local language models or vector databases, the 8GB version is the floor and the 16GB at $120 is the better long-term buy. The 8GB sits in the sweet spot for general-purpose tinkerers who are not sure yet.

Can a Raspberry Pi 5 8GB run a local AI coding agent?

Technically yes for very small models, practically no for anything resembling a productive coding loop. With 8GB of LPDDR4X-4267 and no discrete GPU, you can run quantized 3B-class models (Phi-3 Mini, Qwen2.5 1.5B) at single-digit tokens per second on CPU. That is enough to demo, not enough to pair-program against a real codebase. The Pi 5 is excellent as an always-on agent host for non-LLM tasks (sensor polling, MQTT brokers, scheduled scrapers, MQTT-to-LLM bridges that proxy a remote model), but on-device inference performance is a real bottleneck. For local coding agents on consumer hardware in 2026, an Apple Silicon Mac with 16GB+ unified memory is the price-to-performance floor.

Where do I check that the price hasn't moved since this page was written?

Three places, in order of authority. First, raspberrypi.com/products/raspberry-pi-5 for the MSRP. Second, the approved reseller list at raspberrypi.com/resellers for region-specific street prices. Third, rpilocator.com which scrapes inventory and pricing across resellers in near real time and flags when stock or price drifts. If you see a price 30% above $80 on any reseller listed on raspberrypi.com, something is off (shortage, gray market, or you are looking at a bundle).

What about the 16GB Pi 5? Is it worth the jump?

The 16GB variant launched in early 2025 at $120 USD. For desktop use, local model experimentation, or anything where you would rather buy memory once than twice, the $40 step up is the cleaner buy in 2026. The 8GB still wins on price per board for fleet deployments and headless servers where the memory ceiling does not matter.
