Hosting images on my Astro blog with Cloudflare R2

29 January 2026 · blog · astro

Up until now, I have stored all of the images I use on this Astro blog inside the blog's GitHub repository. But as I continue to write new blog posts about my hikes in Japan - there are 99 posts and counting - this approach wasn't scalable, and my blog had grown to store over 5GB worth of images. So a recent project of mine over the New Year's break was to figure out the best alternative approach to take going forward, and what I landed on was Cloudflare R2.

If you’re running a smaller Astro blog that doesn’t use many images (e.g. one per post) then you definitely don’t need to externally host your images. And if you want to skip the hassle, you can pay for a hosting service that hooks into Astro like Cloudinary. But if your blog is image-heavy and you want to write some fiddly code to save on costs, then this is the post for you.

Why I chose Cloudflare R2

First things first is addressing why I ended up choosing to use Cloudflare R2. As a rough estimate of where I stood:

  • My blog had 5GB worth of images or around 3500 files
  • I upload maybe 100 new images a month
  • My blog gets in the range of 20,000 page views a month (albeit not all of these are on image-heavy hiking pages)
  • Netlify clocks me at 50-60GB of bandwidth a month, and I currently get up to 100GB a month free on a grandfathered legacy plan (which I assume they will kick me off of at some point)

Having never used external image hosting before, I wasn’t too sure where to start. It seems they fall into two categories:

1. Image hosting platforms

Your standard image hosting platform generally comes with storage of the image itself plus some built-in image resizing functionality. Basically, instead of rendering a full 1920px-wide image on your blog (a waste of bandwidth), you can have it dynamically resized to a width closer to what your users are actually viewing it at, e.g. 450px for mobile users.

As a rough overview of the prices:

  • Cloudflare Images: $5/month for 100,000 images + $1 per 100,000 images served
  • ImageKit: free tier is 20GB bandwidth/month (next tier up is $9 a month for 40GB bandwidth)
  • Cloudinary: similarly, the free tier is 25GB bandwidth a month, and the paid tier jumps up to $89/month

Also in the case of Cloudinary with Astro, you get a fancy SDK to make things easier.

Taking a look at my current bandwidth use (50GB/month), I’m definitely not going to be getting on any of the free plans.

2. DIY image hosting

The DIY versions just come with a place to dump your image files, and you have to figure out the rest yourself.

  • Cloudflare R2: $0.015 per GB of storage (first 10GB is free), and $4.50 per million requests (and the first 10 million are free)
  • Backblaze B2: also super cheap, but apparently better for long-term storage rather than a blog where you are constantly fetching and displaying the images
  • Bunny CDN: very cheap at $0.005 per GB of storage (no free tier)

So Cloudflare R2 is way cheaper than using a proper platform, and was actually still free at my usage levels. This seemed like the most reasonable option, not to mention I already use Cloudflare's CDN for my blog.
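As a sanity check on the "still free" claim, here's the storage cost arithmetic (a sketch only, using the 5GB figure and the R2 pricing from above):

```javascript
// Back-of-envelope check that R2 storage is free at my usage levels.
const storageGB = 5;       // my images at the time
const freeStorageGB = 10;  // R2's free storage allowance
const pricePerGB = 0.015;  // $0.015 per GB beyond the free tier

const billableGB = Math.max(0, storageGB - freeStorageGB);
const monthlyStorageCost = billableGB * pricePerGB; // $0 - under the free tier
```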

What image optimisation does Astro do?

So if I used Cloudflare R2, I'd need to DIY any image optimisation or resizing. The first step was figuring out what sort of optimisation Astro does, and then seeing if I could replicate it.

If, for example, you store your images at 1920px in your repository, Astro will convert them to .webp and resize them to a more suitable width when users view them. The easiest way to see this is to look at your blog's source code and hover over an image - the "intrinsic size" is what Astro has resized it to for you.

This is useful for two reasons: a) your images load faster, and b) it saves on bandwidth usage - there's no point rendering a 1920px-wide image on a 400px-wide screen.

Looking at their responsive image docs, it looks like they generate images at 7 different widths, from 640px to 1600px. It then makes use of the srcset and sizes attributes built into the HTML img tag, which let the browser choose the image size that best suits the width it's being rendered at.
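As an illustration of that built-in mechanism (the widths here are made up, not Astro's actual output), the browser picks the best candidate from srcset based on the slot width described by sizes:

```html
<!-- Illustrative only: at up to 800px viewports the image fills the viewport,
     otherwise it renders in an 800px slot; the browser picks a source to match -->
<img
  src="/image-1600w.webp"
  srcset="/image-640w.webp 640w, /image-1080w.webp 1080w, /image-1600w.webp 1600w"
  sizes="(max-width: 800px) 100vw, 800px"
  alt=""
/>
```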

Rendering different image sizes

I currently have 3 ways images can be viewed on my hiking blog:

  • Mobile users: around 400px-ish in width, depending on the phone
  • Desktop browsers: my images cap out at 800px wide, as the content of my posts doesn't extend beyond that
  • Desktop lightbox: if you click on an image, you can also view it at as large a size as your screen allows.
An image in lightbox mode

Generally I find that if I resize my images to 800px and display them at that width, they don't look quite as crisp as I'd like - so I resize my images to 1200px, and then display them at 800px.

I learnt that this is due to what's known as the device pixel ratio (DPR), where displays like my Mac's offer higher pixel density.
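The arithmetic behind this is simple enough to show directly (DPR of 2 is typical for a Retina Mac; exposed in the browser as window.devicePixelRatio):

```javascript
// Why a 1200px image looks crisper than an 800px one in an 800px-wide slot:
// on a 2x display, 800 CSS pixels actually span 1600 device pixels.
const cssWidth = 800;
const dpr = 2; // e.g. window.devicePixelRatio on a Retina display
const devicePixelsNeeded = cssWidth * dpr; // 1600 - so an 800px image falls short
```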

So I decided I wanted to store my images at 3 different sizes - 640, 1200 and 1920px - to account for the three use-cases. I'd also need to make sure to transform my images into the .webp format to save on bandwidth.

Using Astro’s build step

As a side note, it seems like it may be possible to hook into Astro’s build step, so Astro will still do all of the image resizing for you, and then you can upload them to R2.

This might be a more reasonable option for a new blog, but I didn't want to attempt to get Astro to mass-generate images at so many varying sizes for all of my blog posts in one go (that's a lot of images). So I opted to do it with a script, in batches, alongside my new posts over time.

I’m not sure that a script is exactly the best approach - maybe the build step would have turned out better, so your results may vary.

Rendering my images in my Astro blog

Up until now I'd been using plain Markdown to render images in my blog posts, like:

![](image-file.jpeg)

I did this because I wanted to keep my Markdown files as loosely tied to Astro as possible (in case I want to move to another platform in the future).

However if I want to make use of responsive images, using a component seems to be the only way, since I need to pass in the two different width images for uses at different browser widths.

So I made a new ImageCdn component to handle this:

ImageCdn.astro

---
const mobileSrc = `${CLOUDFLARE_URL}/${imageUrl}-${MOBILE_SIZE}w.webp`;
const desktopSrc = `${CLOUDFLARE_URL}/${imageUrl}-${DESKTOP_SIZE}w.webp`;
---

<!-- Not the full code, but as an example -->
<figure class="rehype-figure-title">
    <picture>
        <source media="(max-width: 799px)" srcset={mobileSrc} />
        <source media="(min-width: 800px)" srcset={desktopSrc} />
        <img 
            src={desktopSrc} 
            alt={alt}
            width={width}
            height={height}
            data-full-src={fullSizeSrc}
            data-is-imagecdn="true"
            class="lightbox-image"
            loading="lazy"
        />
    </picture>
    <figcaption>{caption}</figcaption>
</figure>

And then in my blog posts, I can use it like:

<ImageCdn file='file-name' />

And the component will be responsible for generating the correct Cloudflare image URL from R2 to render.
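That URL-building boils down to something like this (a sketch, not the full component code; the constant values are assumptions matching my setup):

```javascript
// Sketch of the component's CDN URL building. CLOUDFLARE_URL is my custom
// domain; the size constants match the widths I store in R2.
const CLOUDFLARE_URL = 'https://cdn.emgoto.com';
const MOBILE_SIZE = 640;
const DESKTOP_SIZE = 1200;

function cdnSrc(file, width) {
  return `${CLOUDFLARE_URL}/${file}-${width}w.webp`;
}

// cdnSrc('hiking/mt-takao/DSCF8585', DESKTOP_SIZE)
//   -> 'https://cdn.emgoto.com/hiking/mt-takao/DSCF8585-1200w.webp'
```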

Inferring image sizes

The other benefit Astro provides is that it reserves space for the height of the image before it loads in - otherwise you have to deal with the jerkiness of images popping in and shifting the content around them. Astro will automatically infer image sizes for all images stored locally.

However, since I'm storing images externally, it seemed I could use a prop called inferSize to get Astro to figure out the size of each image for me. But when I tried it, it caused build issues, since it seemed to be inferring the size of every image on each build (which of course also causes a lot of GET calls to Cloudflare - not good either).

So I opted to just pass in the image sizes when I use the ImageCdn component like so:

<ImageCdn file="horses" alt="" width={3024} height={2488} />

This can be a bit annoying, but it's not as bad as it seems, since most of the photos on my hiking blog are the same size (taken on my Fujifilm). So the component's logic is: if no width/height is passed, fall back to the Fujifilm photo dimensions as the default. It's only the differently-sized images (like ones taken on my phone, or vertical photos) that need the size passed in explicitly.
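That fallback can be sketched like so (the default dimension values here are hypothetical, for illustration only):

```javascript
// Hypothetical defaults standing in for my Fujifilm's photo dimensions.
const DEFAULT_WIDTH = 3024;
const DEFAULT_HEIGHT = 2016;

function imageDimensions(width, height) {
  // Only differently-sized images (phone photos, verticals) pass explicit values
  return {
    width: width ?? DEFAULT_WIDTH,
    height: height ?? DEFAULT_HEIGHT,
  };
}
```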

Dealing with cover images

Another finicky part of this setup were the cover images. This is the top image shown underneath the title of each blog post like here, and also shown as a preview image on other places on my blog.

Although most of the images in my hiking blog posts are shown at the same size (max 800px width), the cover images are unique, as they can also be shown at tiny 100 or 200px widths, e.g. in my homepage's feed.

At first I tried to figure out the best way to make use of responsive images here. One option would have been to fall back to the 1200px-width cover image everywhere on the site, which wastes a little bandwidth in the places where I only actually need an image 200px wide.

To not overcomplicate things, the solution I ended up going with was just keeping the cover image stored alongside each blog post:

hiking
    mt-jinba
        index.mdx
        coverimage-1920w.webp

So I get Astro's responsive image-sizing for free, and I save a tiny bit on bandwidth costs. The downside is I have to store one extra image per post inside my blog itself, but I think that's a fair tradeoff.

Getting started with Cloudflare R2

So now we finally get to the Cloudflare part of the post. To be honest, this is fairly straightforward, as you're just uploading files, and I already had a Cloudflare account with my domain set up. As a brief overview, first I went to my DNS > Records page and added a new cdn.emgoto.com subdomain:

(The placeholder value can be anything, as it will be overridden in the next step).

Then I went to R2 Overview > Settings, and added this URL as a custom domain.

This domain setup means my image URLs will look something like:

https://cdn.emgoto.com/hiking/mt-takao/DSCF8585-1920w.webp

Uploading images to Cloudflare R2

So I used the very handy Cursor AI to generate the script that uploads my images to Cloudflare. For each image, I upload the following variants:

  • mt-jinba.jpeg
  • mt-jinba-1920w.webp
  • mt-jinba-1200w.webp
  • mt-jinba-640w.webp

I don't actually use the jpeg file anywhere; it's just for safe-keeping in case I need it later.

Cloudflare doesn't technically have folders, but if you add slashes to file names, like hiking/mt-jinba/cover-image.jpeg, then when you browse the images in the UI they will show up as folders.

I'm not of the opinion that AI is always good at writing code, but I tend to find it at least does an okay job with scripts. There was still a lot of trial and error, where I would let it loose on one image or one folder at a time, and then tweak the script depending on what it got wrong.

I don’t really want to share the AI slop code here on this blog, but as an overview of what it does, it first converts the JPEG file to a webp file:

import sharp from 'sharp';

await sharp(inputPath)
  .resize({ width, withoutEnlargement: true })
  .webp({
    quality: 80,
    effort: 4,
    smartSubsample: true,
  })
  .toFile(outputPath);

And then makes copies at all the different sizes, before uploading the images to Cloudflare R2:

import { readFile } from 'node:fs/promises';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
	region: 'auto',
	endpoint: `https://${R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
	credentials: {
		accessKeyId: R2_ACCESS_KEY_ID,
		secretAccessKey: R2_SECRET_ACCESS_KEY,
	},
});
 
// ...
 
const fileContent = await readFile(filePath);
const command = new PutObjectCommand({
	Bucket: R2_BUCKET_NAME,
	Key: `${R2_PATH}/${fileName}`,
	Body: fileContent,
	ContentType: 'image/webp',
});
 
await s3Client.send(command);
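The "copies at all the different sizes" step is just a loop over the target widths. As a sketch (the naming scheme matches my setup; the actual resize and upload calls are as shown above):

```javascript
// Compute the variant file names generated for each source image;
// each name then gets a resize + upload pass.
const WIDTHS = [1920, 1200, 640];

function variantNames(baseName) {
  return WIDTHS.map((w) => `${baseName}-${w}w.webp`);
}

// variantNames('mt-jinba') ->
//   ['mt-jinba-1920w.webp', 'mt-jinba-1200w.webp', 'mt-jinba-640w.webp']
```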

Updating the blog post content

When I initially write my blog posts in Obsidian, they start as plain Markdown, with images defined like ![](image-name.jpeg). So, as a final step, the script also goes through, finds all the Markdown images, and converts them to the <ImageCdn /> format. It will also grab the widths/heights of the images and pass them in where required.
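That conversion can be sketched as a regex replace (the regex and the output shape here are assumptions based on my own format, not the script's exact code):

```javascript
// Convert ![](image-name.jpeg) Markdown images into the <ImageCdn /> format.
function convertImages(markdown) {
  return markdown.replace(
    /!\[\]\(([^)]+)\.jpe?g\)/g,
    (_match, name) => `<ImageCdn file="${name}" alt="" />`
  );
}

// convertImages('![](horses.jpeg)') -> '<ImageCdn file="horses" alt="" />'
```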

After I write a new blog post, I just run my script, and it uploads all the images and updates the blog post content for me. I then do a quick pass of the post itself to double-check that all the images uploaded correctly.

Occasional image uploads not working

The one pain point I have is that sometimes in the Cloudflare UI, it looks like the upload has worked:

But when I try to view the image by its URL directly, the page gives me a 404:

In that case, I just delete the file (and/or rename it slightly) and re-upload it.

The results

I slowly transferred all of my hiking images across to Cloudflare R2 in the first half of January.

I'm currently storing just under 9GB in Cloudflare R2, and have had 133k "Class B" operations (which is what an image load counts as). I suppose I'll hit the 10GB free storage limit at some point, at which point I'll have to pay an extra 1.5c per GB. And I'm nowhere near hitting the 10 million free Class B operations.

Over on Netlify, my daily bandwidth usage has dropped to about a quarter of its previous levels.

And over that same time period, my page view analytics have been pretty stable (and maybe even gone up a little bit).

So overall, things were a success.

Having an Astro blog with this many photos on it is a pretty niche problem to have, and I honestly didn't know where to start. So I hope you found this post useful if you need to do something similar - or maybe it has steered you toward trying something else, if this all sounds like too much of a pain.

It is a bit of a fragile setup, but I think having AI tools to help write the scripts made it a lot smoother (albeit it still took a lot longer to get working than I would have liked!).
